First De-Trend then Attend: Rethinking Attention for Time-Series Forecasting

12/15/2022
by Xiyuan Zhang, et al.

Transformer-based models have gained substantial popularity and demonstrated promising results in long-term time-series forecasting in recent years. In addition to learning attention in the time domain, recent works also explore learning attention in frequency domains (e.g., the Fourier domain and the wavelet domain), since seasonal patterns can be better captured there. In this work, we seek to understand the relationships between attention models in the time and frequency domains. Theoretically, we show that attention models in different domains are equivalent under linear conditions (i.e., a linear kernel applied to the attention scores). Empirically, we analyze how attention models in different domains behave differently through synthetic experiments with seasonality, trend, and noise, with emphasis on the role of the softmax operation. Both the theoretical and empirical analyses motivate us to propose a new method, TDformer (Trend Decomposition Transformer), which first applies seasonal-trend decomposition and then additively combines an MLP that predicts the trend component with Fourier attention that predicts the seasonal component to obtain the final prediction. Extensive experiments on benchmark time-series forecasting datasets demonstrate that TDformer achieves state-of-the-art performance compared with existing attention-based models.
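The pipeline the abstract describes can be sketched as follows. This is a minimal, illustrative NumPy mock-up, not the paper's implementation: the moving-average decomposition, the MLP widths, and the toy per-frequency "Fourier attention" (softmax over FFT coefficients) are all simplifying assumptions made here for clarity, with random untrained weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def series_decompose(x, kernel=25):
    """Seasonal-trend decomposition via an edge-padded moving average."""
    pad = kernel // 2
    padded = np.concatenate([np.full(pad, x[0]), x, np.full(pad, x[-1])])
    trend = np.convolve(padded, np.ones(kernel) / kernel, mode="valid")
    return x - trend, trend  # (seasonal, trend)

def mlp_trend(trend, W1, W2):
    """Two-layer ReLU MLP mapping the input trend to the forecast horizon."""
    return np.maximum(trend @ W1, 0.0) @ W2

def fourier_attention(seasonal, Wq, Wk, Wv, out_len):
    """Toy frequency-domain attention: project FFT coefficients per
    frequency, softmax-attend across frequencies, invert to time domain."""
    f = np.fft.rfft(seasonal)                    # complex spectrum
    q, k, v = f * Wq, f * Wk, f * Wv             # per-frequency projections
    scores = np.abs(q[:, None] * np.conj(k[None, :]))
    attn = np.exp(scores - scores.max(-1, keepdims=True))
    attn /= attn.sum(-1, keepdims=True)          # softmax over frequencies
    mixed = attn @ v
    return np.fft.irfft(mixed, n=out_len)        # crop/pad spectrum to horizon

# Toy forecast: input length L, horizon H (sizes are arbitrary choices).
L, H, hidden = 96, 24, 32
x = np.sin(np.arange(L) * 2 * np.pi / 24) + 0.01 * np.arange(L)
seasonal, trend = series_decompose(x)
F = L // 2 + 1
W1 = rng.normal(size=(L, hidden)) * 0.1
W2 = rng.normal(size=(hidden, H)) * 0.1
Wq, Wk, Wv = (rng.normal(size=F) for _ in range(3))

# Additive combination: MLP on trend + Fourier attention on seasonal.
forecast = mlp_trend(trend, W1, W2) + fourier_attention(seasonal, Wq, Wk, Wv, H)
```

The key structural point is the additive split: the smooth trend goes through a plain MLP (where softmax attention tends to struggle, per the paper's synthetic analysis), while the seasonal residual is handled by attention over frequency components.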


