Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution

01/05/2023
by   Yan Li, et al.

Long-term time-series forecasting (LTTF) has become a pressing demand in many applications, such as wind power supply planning. Transformer models have been adopted to deliver high prediction capacity thanks to the self-attention mechanism, albeit at high computational cost. Though one could lower the complexity of Transformers by inducing sparsity in point-wise self-attention for LTTF, the limited information utilization prevents the model from exploring complex dependencies comprehensively. To this end, we propose an efficient Transformer-based model, named Conformer, which differentiates itself from existing methods for LTTF in three aspects: (i) an encoder-decoder architecture with linear complexity that does not sacrifice information utilization, built on top of sliding-window attention and a Stationary and Instant Recurrent Network (SIRN); (ii) a module derived from normalizing flows that further improves information utilization by inferring the outputs directly from the latent variables in the SIRN; (iii) explicit modeling of the inter-series correlation and temporal dynamics in time-series data to fuel the downstream self-attention mechanism. Extensive experiments on seven real-world datasets demonstrate that Conformer outperforms the state-of-the-art methods on LTTF and generates reliable prediction results with uncertainty quantification.
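The linear-complexity claim rests on sliding-window attention: each query attends only to keys inside a fixed local window, so cost grows as O(seq_len × window) rather than O(seq_len²). The sketch below is a minimal NumPy illustration of that idea, not the authors' actual Conformer implementation; the function name and window parameter are assumptions for exposition.

```python
import numpy as np

def sliding_window_attention(q, k, v, window=3):
    """Local self-attention sketch: position t attends only to keys in
    [t - window, t + window], giving cost linear in sequence length.
    q, k, v: arrays of shape (seq_len, d)."""
    seq_len, d = q.shape
    out = np.zeros_like(v)
    for t in range(seq_len):
        lo, hi = max(0, t - window), min(seq_len, t + window + 1)
        scores = q[t] @ k[lo:hi].T / np.sqrt(d)  # scaled dot-product scores
        weights = np.exp(scores - scores.max())  # stable softmax
        weights /= weights.sum()
        out[t] = weights @ v[lo:hi]              # weighted sum of local values
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8))
y = sliding_window_attention(x, x, x, window=2)
print(y.shape)  # (16, 8)
```

Each output row is a convex combination of at most 2·window + 1 value rows, which is what keeps the per-position work constant as the horizon grows.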


Related research

- 06/24/2021 · Autoformer: Decomposition Transformers with Auto-Correlation for Long-Term Series Forecasting
  Extending the forecasting time is a critical demand for real application...

- 09/01/2022 · Temporal Conditional VAE for Distributional Drift Adaptation in Multivariate Time Series
  Due to the nonstationary nature, the distribution of real-world multivar...

- 02/23/2022 · Preformer: Predictive Transformer with Multi-Scale Segment-wise Correlations for Long-Term Time Series Forecasting
  Transformer-based methods have shown great potential in long-term time s...

- 10/02/2022 · Grouped self-attention mechanism for a memory-efficient Transformer
  Time-series data analysis is important because numerous real-world tasks...

- 07/12/2022 · Split Time Series into Patches: Rethinking Long-term Series Forecasting with Dateformer
  Time is one of the most significant characteristics of time-series, yet ...

- 11/06/2018 · Modeling and Predicting Popularity Dynamics via Deep Learning Attention Mechanism
  An ability to predict the popularity dynamics of individual items within...

- 08/29/2021 · TCCT: Tightly-Coupled Convolutional Transformer on Time Series Forecasting
  Time series forecasting is essential for a wide range of real-world appl...
