ETSformer: Exponential Smoothing Transformers for Time-series Forecasting

by Gerald Woo, et al.

Transformers have been actively studied for time-series forecasting in recent years. While often showing promising results in various scenarios, traditional Transformers are not designed to fully exploit the characteristics of time-series data and thus suffer from some fundamental limitations: they generally lack decomposition capability and interpretability, and are neither effective nor efficient for long-term forecasting. In this paper, we propose ETSformer, a novel time-series Transformer architecture, which exploits the principle of exponential smoothing to improve Transformers for time-series forecasting. In particular, inspired by classical exponential smoothing methods in time-series forecasting, we propose novel exponential smoothing attention (ESA) and frequency attention (FA) mechanisms to replace the self-attention mechanism in vanilla Transformers, improving both accuracy and efficiency. Building on these, we redesign the Transformer architecture with modular decomposition blocks so that it can learn to decompose time-series data into interpretable components such as level, growth, and seasonality. Extensive experiments on various time-series benchmarks validate the efficacy and advantages of the proposed method. The code and models of our implementation will be released.
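The abstract names the two mechanisms only briefly, so a rough illustration may help. Below is a minimal NumPy sketch of the two ideas, not the authors' released implementation: exponential smoothing attention weights past timesteps with exponentially decaying coefficients instead of query-key dot products, and frequency attention extracts a seasonal component by keeping only dominant Fourier frequencies. The function names, the fixed `alpha`, and `top_k` are illustrative assumptions; in ETSformer these would be learned or tuned per layer.

```python
import numpy as np

def exponential_smoothing_attention(values, alpha=0.5):
    """Attend to past timesteps with exponentially decaying weights.

    Instead of query-key dot products, step t uses weights
    w_j = alpha * (1 - alpha)^(t - j) over values[0..t], so recent
    observations dominate. `values` has shape (seq_len, d_model);
    the fixed `alpha` stands in for what would be a learned decay
    rate in the actual model.
    """
    seq_len, _ = values.shape
    out = np.zeros_like(values)
    for t in range(seq_len):
        # Exponents [t, t-1, ..., 0]: the most recent value gets weight alpha.
        decay = alpha * (1.0 - alpha) ** np.arange(t, -1, -1)
        # Renormalize so weights sum to one (the leftover mass
        # (1 - alpha)^(t + 1) would otherwise go to an initial state).
        weights = decay / decay.sum()
        out[t] = weights @ values[: t + 1]
    return out

def frequency_attention(values, top_k=2):
    """Extract a rough seasonal component by keeping only the top_k
    largest-magnitude Fourier frequencies per channel."""
    spec = np.fft.rfft(values, axis=0)
    # Zero out all but the top_k dominant frequencies in each channel.
    weakest = np.argsort(np.abs(spec), axis=0)[:-top_k, :]
    np.put_along_axis(spec, weakest, 0.0, axis=0)
    return np.fft.irfft(spec, n=values.shape[0], axis=0)

# Usage: a noisy trend is smoothed by ESA; a noisy sine is cleaned by FA.
rng = np.random.default_rng(0)
trend = np.arange(64, dtype=float)[:, None] + rng.normal(size=(64, 1))
season = np.sin(np.linspace(0, 8 * np.pi, 64))[:, None] + 0.2 * rng.normal(size=(64, 1))
print(exponential_smoothing_attention(trend)[-3:].ravel())
print(frequency_attention(season)[:3].ravel())
```

Because the decay weights and the frequency mask are fixed structures rather than input-dependent dot products, both operations can be computed more cheaply than full self-attention, which is presumably the source of the efficiency gain the abstract claims; consult the released implementation for the exact formulation.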

