TCCT: Tightly-Coupled Convolutional Transformer on Time Series Forecasting

08/29/2021
by Li Shen, et al.

Time series forecasting is essential for a wide range of real-world applications. Recent studies have shown the superiority of the Transformer in dealing with such problems, especially long sequence time series input (LSTI) and long sequence time series forecasting (LSTF) problems. To improve the efficiency and enhance the locality of the Transformer, these studies combine the Transformer with CNNs to varying degrees. However, their combinations are loosely coupled and do not make full use of CNNs. To address this issue, we propose the concept of the tightly-coupled convolutional Transformer (TCCT) and three TCCT architectures that apply transformed CNN architectures to the Transformer: (1) CSPAttention: by fusing CSPNet with the self-attention mechanism, the computation cost of self-attention is reduced by 30% and the memory usage by 50%, while prediction accuracy is maintained or improved. (2) Dilated causal convolution: this method modifies the distilling operation proposed by Informer by replacing canonical convolutional layers with dilated causal convolutional layers to gain exponential receptive field growth. (3) Passthrough mechanism: applying the passthrough mechanism to the stack of self-attention blocks helps Transformer-like models obtain more fine-grained information with negligible extra computation cost. Our experiments on real-world datasets show that our TCCT architectures can greatly improve the performance of existing state-of-the-art Transformer models on time series forecasting with much lower computation and memory costs, including the canonical Transformer, LogTrans, and Informer.
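
As an illustration of the dilated causal convolution idea above, the following is a minimal PyTorch sketch (not the authors' reference code) of a distilling block in which a dilated causal 1-D convolution replaces the canonical convolution before max-pooling; the class name, layer choices, and hyperparameters are assumptions made for this example. Stacking such blocks with increasing dilation (1, 2, 4, ...) is what yields the exponential receptive field growth mentioned in the abstract.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class DilatedCausalDistilling(nn.Module):
        # Illustrative distilling block (assumed layer choices): a dilated causal
        # Conv1d feeds a stride-2 max-pool that halves the sequence length, while
        # the dilation widens the receptive field as blocks are stacked.
        def __init__(self, d_model: int, kernel_size: int = 3, dilation: int = 2):
            super().__init__()
            # Left-only padding keeps the convolution causal: the output at step t
            # depends only on inputs at steps <= t.
            self.causal_pad = (kernel_size - 1) * dilation
            self.conv = nn.Conv1d(d_model, d_model, kernel_size, dilation=dilation)
            self.norm = nn.BatchNorm1d(d_model)
            self.activation = nn.ELU()
            self.pool = nn.MaxPool1d(kernel_size=3, stride=2, padding=1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, d_model); Conv1d expects (batch, d_model, seq_len)
            y = x.transpose(1, 2)
            y = F.pad(y, (self.causal_pad, 0))            # pad on the left only
            y = self.activation(self.norm(self.conv(y)))
            y = self.pool(y)                              # halve the sequence length
            return y.transpose(1, 2)

    # Example: a length-96 input sequence is distilled to length 48.
    block = DilatedCausalDistilling(d_model=512)
    out = block(torch.randn(8, 96, 512))
    print(out.shape)  # torch.Size([8, 48, 512])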

Related research

06/29/2019 - Enhancing the Locality and Breaking the Memory Bottleneck of Transformer on Time Series Forecasting
Time series forecasting is an important problem across many domains, inc...

12/14/2020 - Informer: Beyond Efficient Transformer for Long Sequence Time-Series Forecasting
Many real-world applications require the prediction of long sequence tim...

12/06/2021 - Parameter Efficient Deep Probabilistic Forecasting
Probabilistic time series forecasting is crucial in many application dom...

01/23/2023 - AttMEMO: Accelerating Transformers with Memoization on Big Memory Systems
Transformer models gain popularity because of their superior inference a...

07/28/2023 - Universal Recurrent Event Memories for Streaming Data
In this paper, we propose a new event memory architecture (MemNet) for r...

01/05/2023 - Towards Long-Term Time-Series Forecasting: Feature, Pattern, and Distribution
Long-term time-series forecasting (LTTF) has become a pressing demand in...

07/14/2022 - Rethinking Attention Mechanism in Time Series Classification
Attention-based models have been widely used in many areas, such as comp...
