ETSformer: Exponential Smoothing Transformers for Time-series Forecasting

02/03/2022
by Gerald Woo, et al.

Transformers have been actively studied for time-series forecasting in recent years. While they often show promising results in various scenarios, traditional Transformers are not designed to fully exploit the characteristics of time-series data and thus suffer from fundamental limitations: they generally lack decomposition capability and interpretability, and are neither effective nor efficient for long-term forecasting. In this paper, we propose ETSformer, a novel time-series Transformer architecture, which exploits the principle of exponential smoothing to improve Transformers for time-series forecasting. In particular, inspired by the classical exponential smoothing methods in time-series forecasting, we propose the novel exponential smoothing attention (ESA) and frequency attention (FA) to replace the self-attention mechanism in vanilla Transformers, improving both accuracy and efficiency. Based on these, we redesign the Transformer architecture with modular decomposition blocks such that it can learn to decompose the time-series data into interpretable components such as level, growth, and seasonality. Extensive experiments on various time-series benchmarks validate the efficacy and advantages of the proposed method. The code and models of our implementations will be released.
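To make the exponential smoothing idea concrete, the sketch below shows the classical simple exponential smoothing recurrence that the abstract refers to, and how it corresponds to an attention-style weight matrix whose weights decay exponentially with age. This is only an illustrative sketch of the underlying principle, not the paper's actual ESA or FA modules; the function names, the smoothing parameter alpha, and the initial-state handling are assumptions made for illustration.

```python
import numpy as np

def exponential_smoothing(x, alpha=0.3, s0=0.0):
    """Classical simple exponential smoothing:
    s_t = alpha * x_t + (1 - alpha) * s_{t-1}."""
    s = s0
    out = []
    for x_t in x:
        s = alpha * x_t + (1 - alpha) * s
        out.append(s)
    return np.array(out)

def exponential_decay_weights(T, alpha=0.3):
    """Lower-triangular, attention-style weight matrix: row t assigns
    weight alpha * (1 - alpha)**(t - j) to position j <= t, so recent
    observations receive exponentially larger weights than older ones."""
    W = np.zeros((T, T))
    for t in range(T):
        for j in range(t + 1):
            W[t, j] = alpha * (1 - alpha) ** (t - j)
    return W

if __name__ == "__main__":
    x = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
    # Recurrence form and matrix form agree when the initial state s0 is 0,
    # since unrolling gives s_t = sum_j alpha * (1 - alpha)**(t - j) * x_j.
    print(exponential_smoothing(x))
    print(exponential_decay_weights(len(x)) @ x)
```

Unlike vanilla self-attention, these weights follow a fixed exponential-decay pattern over relative positions rather than being computed from query-key dot products, which is the intuition behind replacing self-attention with exponential smoothing attention for time-series data.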


Related research

The DeepCAR Method: Forecasting Time-Series Data That Have Change Points (02/22/2023)
Many methods for time-series forecasting are known in classical statisti...

Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting (06/24/2021)
Extending the forecasting time is a critical demand for real application...

Transformers in Time Series: A Survey (02/15/2022)
Transformers have achieved superior performances in many tasks in natura...

FormerTime: Hierarchical Multi-Scale Representations for Multivariate Time Series Classification (02/20/2023)
Deep learning-based algorithms, e.g., convolutional networks, have signi...

Transformers predicting the future. Applying attention in next-frame and time series forecasting (08/18/2021)
Recurrent Neural Networks were, until recently, one of the best ways to ...

Transformer Training Strategies for Forecasting Multiple Load Time Series (06/19/2023)
Recent work uses Transformers for load forecasting, which are the state ...
