TEDformer: Temporal Feature Enhanced Decomposed Transformer for Long-Term Series Forecasting
In recent years, Transformer-based models have achieved good results in the analysis and application of time series. In particular, the introduction of Autoformer has further improved the performance of the model in long-term sequence prediction. However, Transformer-based models, such as Autoformer, have not fully considered the local temporal features of the sequence, and have not addressed the impact of sequence anomalies on decomposition and the processing of trend terms. To address these issues, we combined the excellent performance of the time convolutional neural network (TCN) on time series data and the advantages of the STL inner-outer loop decomposition to design the TEDformer, a Transformer prediction model enhanced with global and local temporal features. The model decomposes the time series into trend and periodic terms using STL and extracts temporal features accordingly. We conducted experiments on six real-world datasets, and the results showed that our model improved by 10.8% on multivariate datasets and 15.7% on univariate datasets compared to state-of-the-art models.
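The abstract describes decomposing a series into trend and periodic terms before feature extraction. As a rough illustration of that idea only (not the paper's method — TEDformer uses STL's iterative inner-outer loop decomposition, which is more robust to anomalies), here is a minimal numpy sketch of a one-pass split into a moving-average trend and period-wise seasonal means; `toy_decompose` and its `period`/`window` values are illustrative assumptions:

```python
import numpy as np

def toy_decompose(x, period, window):
    """Split a 1-D series into trend, seasonal, and residual parts.

    Trend: centered moving average (edges padded by repeating endpoints).
    Seasonal: mean of the detrended series at each phase of the period,
    tiled back to the series length.
    This is a toy stand-in for STL, which refines both parts iteratively.
    """
    pad = window // 2
    padded = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    kernel = np.ones(window) / window
    trend = np.convolve(padded, kernel, mode="valid")[: len(x)]
    detrended = x - trend
    seasonal = np.array([detrended[i::period].mean() for i in range(period)])
    seasonal = np.tile(seasonal, len(x) // period + 1)[: len(x)]
    resid = x - trend - seasonal
    return trend, seasonal, resid

# Synthetic series: linear trend + sine wave with period 12
t = np.arange(240)
x = 0.05 * t + np.sin(2 * np.pi * t / 12)
trend, seasonal, resid = toy_decompose(x, period=12, window=13)
```

By construction the three components sum back to the original series exactly; in STL the trend and seasonal estimates are instead refined over several passes, with robustness weights down-weighting anomalous points.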
| Published in: | IEEE Access |
|---|---|
| Main Authors: | Jiayi Fan, Bingyao Wang, Dong Bian |
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Subjects: | Time series forecasting; autoformer; temporal convolutional neural networks; transformer |
| Online Access: | https://ieeexplore.ieee.org/document/10156810/ |
| _version_ | 1849451229084647424 |
|---|---|
| author | Jiayi Fan, Bingyao Wang, Dong Bian |
| collection | DOAJ |
| container_title | IEEE Access |
| description | In recent years, Transformer-based models have achieved good results in the analysis and application of time series. In particular, the introduction of Autoformer has further improved the performance of the model in long-term sequence prediction. However, Transformer-based models, such as Autoformer, have not fully considered the local temporal features of the sequence, and have not addressed the impact of sequence anomalies on decomposition and the processing of trend terms. To address these issues, we combined the excellent performance of the time convolutional neural network (TCN) on time series data and the advantages of the STL inner-outer loop decomposition to design the TEDformer, a Transformer prediction model enhanced with global and local temporal features. The model decomposes the time series into trend and periodic terms using STL and extracts temporal features accordingly. We conducted experiments on six real-world datasets, and the results showed that our model improved by 10.8% on multivariate datasets and 15.7% on univariate datasets compared to state-of-the-art models. |
| format | Article |
| id | doaj-art-63247b42ae0f4e2daca2aea98cd8ef71 |
| institution | Directory of Open Access Journals |
| issn | 2169-3536 |
| language | English |
| publishDate | 2025-01-01 |
| publisher | IEEE |
| record_format | Article |
| spelling | Record doaj-art-63247b42ae0f4e2daca2aea98cd8ef71, updated 2025-08-20T03:27:36Z. IEEE Access, ISSN 2169-3536, vol. 13, pp. 120821-120829, 2025-01-01. DOI: 10.1109/ACCESS.2023.3287893 (IEEE document 10156810). Jiayi Fan (https://orcid.org/0000-0001-7606-9213), School of Computer Science and Technology, Qingdao University, Qingdao, China; Bingyao Wang, Shandong Association of Artificial Intelligence, Jinan, China; Dong Bian, School of Microelectronics, Shandong University, Jinan, China. |
| title | TEDformer: Temporal Feature Enhanced Decomposed Transformer for Long-Term Series Forecasting |
| topic | Time series forecasting autoformer temporal convolutional neural networks transformer |
| url | https://ieeexplore.ieee.org/document/10156810/ |
