Does Long-Term Series Forecasting Need Complex Attention and Extra Long Inputs?

06/08/2023
by Daojun Liang, et al.

As Transformer-based models have achieved impressive performance on various time series tasks, Long-Term Series Forecasting (LTSF) has also received extensive attention in recent years. However, due to the inherent computational complexity of Transformer-based methods and their demand for long input sequences, their application to LTSF tasks still raises two major questions that need further investigation: 1) whether the sparse attention mechanisms designed by these methods actually reduce running time on real devices; 2) whether these models need extra long input sequences to guarantee their performance. The answers given in this paper are negative. To better cope with these two issues, we design a lightweight Period-Attention mechanism (Periodformer), which renovates the aggregation of long-term subseries via explicit periodicity and of short-term subseries via built-in proximity. Meanwhile, a gating mechanism is embedded into Periodformer to regulate the influence of the attention module on the prediction results. Furthermore, to take full advantage of GPUs for fast hyperparameter optimization (e.g., finding a suitable input length), a Multi-GPU Asynchronous parallel algorithm based on Bayesian Optimization (MABO) is presented. MABO allocates a process to each GPU via a queue mechanism and then creates multiple trials at a time for asynchronous parallel search, which greatly reduces the search time. Compared with the state-of-the-art methods, Periodformer reduces prediction error by 13% on multivariate forecasting and also improves univariate forecasting. In addition, MABO reduces the average search time by 46%. Overall, this paper indicates that LTSF may not need complex attention or extra long input sequences. The source code will be open-sourced on GitHub.
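To make the Period-Attention idea concrete, the following is a minimal sketch, assuming the input series is reshaped by a known period so that attention only aggregates period-aligned (long-term) subseries while short-term proximity is kept within each period, and a sigmoid gate scales the attention output. The module name, shapes, and the exact form of the gate are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of a period-based, gated attention block (not the paper's code).
import torch
import torch.nn as nn

class PeriodAttentionSketch(nn.Module):
    def __init__(self, d_model: int, period: int, n_heads: int = 4):
        super().__init__()
        self.period = period
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.gate = nn.Linear(d_model, d_model)  # assumed form of the gating mechanism

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, d_model); length is assumed to be a multiple of the period.
        b, l, d = x.shape
        n = l // self.period
        # Group time steps that share the same phase: (batch * period, n_periods, d_model).
        xp = x.reshape(b, n, self.period, d).permute(0, 2, 1, 3).reshape(b * self.period, n, d)
        # Long-term aggregation: attention mixes only period-aligned subseries,
        # so each attention call covers length / period tokens instead of the full length.
        out, _ = self.attn(xp, xp, xp)
        out = out.reshape(b, self.period, n, d).permute(0, 2, 1, 3).reshape(b, l, d)
        # The gate regulates how much the attention output influences the prediction.
        return x + torch.sigmoid(self.gate(x)) * out

x = torch.randn(2, 96, 64)                            # toy batch: length 96, model dim 64
print(PeriodAttentionSketch(64, period=24)(x).shape)  # torch.Size([2, 96, 64])
```

Likewise, the process-per-GPU queue and asynchronous trial scheduling described for MABO can be illustrated with off-the-shelf tools. The sketch below uses Optuna's default sampler as a stand-in Bayesian optimizer and a multiprocessing queue to hand one GPU to each worker process; the study name, storage path, search space, and dummy objective are all placeholders, not the paper's MABO algorithm.

```python
# Illustration of the process-per-GPU, asynchronous-trial idea (not the paper's MABO code).
import multiprocessing as mp
import optuna

def objective(trial, gpu_id):
    # Placeholder search space: input length and learning rate are the tuned hyperparameters.
    input_len = trial.suggest_int("input_length", 48, 720, step=24)
    lr = trial.suggest_float("lr", 1e-4, 1e-2, log=True)
    # A real objective would train and validate a model on f"cuda:{gpu_id}";
    # a dummy score keeps the sketch self-contained.
    return (input_len - 96) ** 2 * lr

def worker(gpu_queue, n_trials):
    gpu_id = gpu_queue.get()  # each process claims one GPU from the shared queue
    study = optuna.load_study(study_name="mabo_sketch", storage="sqlite:///mabo_sketch.db")
    # Trials run asynchronously: every process samples and reports against the shared
    # study on its own schedule, so no worker waits for the others.
    study.optimize(lambda t: objective(t, gpu_id), n_trials=n_trials)

if __name__ == "__main__":
    optuna.create_study(study_name="mabo_sketch", storage="sqlite:///mabo_sketch.db",
                        direction="minimize", load_if_exists=True)
    n_gpus = 4                # assumed number of visible GPUs
    gpu_queue = mp.Queue()
    for gpu_id in range(n_gpus):
        gpu_queue.put(gpu_id)
    procs = [mp.Process(target=worker, args=(gpu_queue, 10)) for _ in range(n_gpus)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    study = optuna.load_study(study_name="mabo_sketch", storage="sqlite:///mabo_sketch.db")
    print(study.best_params)
```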


Related research

07/22/2021 · Tsformer: Time series Transformer for tourism demand forecasting
AI-based methods have been widely applied to tourism demand forecasting....

07/19/2021 · Long-term series forecasting with Query Selector – efficient model of sparse attention
Various modifications of TRANSFORMER were recently used to solve time-se...

07/28/2021 · Long-term series forecasting with Query Selector
Various modifications of TRANSFORMER were recently used to solve time-se...

07/27/2023 · HUTFormer: Hierarchical U-Net Transformer for Long-Term Traffic Forecasting
Traffic forecasting, which aims to predict traffic conditions based on h...

02/23/2022 · Preformer: Predictive Transformer with Multi-Scale Segment-wise Correlations for Long-Term Time Series Forecasting
Transformer-based methods have shown great potential in long-term time s...

08/22/2023 · SegRNN: Segment Recurrent Neural Network for Long-Term Time Series Forecasting
RNN-based methods have faced challenges in the Long-term Time Series For...

05/24/2022 · FreDo: Frequency Domain-based Long-Term Time Series Forecasting
The ability to forecast far into the future is highly beneficial to many...
