DRAformer: Differentially Reconstructed Attention Transformer for Time-Series Forecasting

06/11/2022
by Benhan Li, et al.

Time-series forecasting plays an important role in many real-world scenarios, such as equipment life-cycle forecasting, weather forecasting, and traffic-flow forecasting. Recent research shows that a variety of transformer-based models achieve remarkable results on time-series forecasting. However, two issues still limit these models: (i) learning directly on raw data is susceptible to noise because the raw feature representation is complex and unstable; (ii) self-attention mechanisms pay insufficient attention to changing features and temporal dependencies. To address these problems, we propose DRAformer, a transformer-based model with differentially reconstructed attention. DRAformer introduces three innovations: (i) it learns on differenced sequences, since differencing yields clearer, more stable features and highlights how the sequence changes; (ii) it reconstructs attention: a distance attention expresses temporal distance through a learnable Gaussian kernel, a distribution-difference attention measures distributional differences by mapping the differenced sequence into an adaptive feature space, and their combination focuses effectively on strongly associated subsequences; (iii) it reconstructs the decoder input, extracting sequence features that integrate variation information with temporal correlations, thereby obtaining a more comprehensive sequence representation. Extensive experiments on four large-scale datasets demonstrate that DRAformer outperforms state-of-the-art baselines.
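The two core ideas in the abstract can be illustrated with a minimal sketch: first-order differencing of the input series, and an attention bias computed from temporal distance through a Gaussian kernel. This is a simplified, hypothetical rendering (the paper's exact formulation, including how `sigma` is learned and how the two attentions are combined, is not given here); `difference` and `gaussian_distance_attention` are illustrative names, not the authors' code.

```python
import numpy as np

def difference(x):
    """First-order differencing along time: d_t = x_t - x_{t-1}.
    Stabilizes the series and highlights changes (classic preprocessing;
    DRAformer's exact variant may differ)."""
    return x[1:] - x[:-1]

def gaussian_distance_attention(q_pos, k_pos, sigma):
    """Distance-based attention weights from a Gaussian kernel over
    temporal positions; in the model, sigma would be a learnable
    parameter (assumption for this sketch)."""
    # pairwise squared distances between query and key time indices
    d2 = (q_pos[:, None] - k_pos[None, :]) ** 2
    scores = np.exp(-d2 / (2.0 * sigma ** 2))
    # normalize each row so the weights sum to 1
    return scores / scores.sum(axis=1, keepdims=True)

x = np.array([1.0, 3.0, 2.0, 5.0, 4.0])
dx = difference(x)                       # differenced sequence: [2., -1., 3., -1.]
pos = np.arange(len(dx), dtype=float)
attn = gaussian_distance_attention(pos, pos, sigma=1.0)
```

With equal query and key positions, each row of `attn` peaks on its own time step and decays smoothly with temporal distance, which is the "sequential distance" behavior the abstract attributes to the Gaussian-kernel attention.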

Related research

- Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting (12/14/2020)
- A Differential Attention Fusion Model Based on Transformer for Time Series Forecasting (02/23/2022)
- FPTN: Fast Pure Transformer Network for Traffic Flow Forecasting (03/14/2023)
- Joint Forecasting of Panoptic Segmentations with Difference Attention (04/14/2022)
- Multi-Task Time Series Forecasting With Shared Attention (01/24/2021)
- A Joint Time-frequency Domain Transformer for Multivariate Time Series Forecasting (05/24/2023)
- Generalizable Memory-driven Transformer for Multivariate Long Sequence Time-series Forecasting (07/16/2022)
