Long-term series forecasting with Query Selector – efficient model of sparse attention

07/19/2021
by Jacek Klimek, et al.

Various modifications of the Transformer architecture have recently been used to solve the time-series forecasting problem. We propose Query Selector, an efficient, deterministic algorithm for constructing a sparse attention matrix. Experiments show that it achieves state-of-the-art results on the ETT dataset.
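To make the idea of deterministic sparse attention concrete, the sketch below shows one possible query-selection scheme in PyTorch. It is an illustration under stated assumptions, not the paper's algorithm: the function name, the scoring of queries against the mean key, the `fraction` parameter, and the mean-value fallback for unselected positions are all choices made for this example.

```python
import torch


def sparse_attention_query_selection(Q, K, V, fraction=0.25):
    """Sketch of deterministic sparse attention via query selection.

    Q, K, V: (batch, seq_len, d_model) tensors.
    Only a fixed fraction of queries receives full attention; the
    selection is deterministic (no sampling).
    """
    B, L, D = Q.shape
    k = max(1, int(L * fraction))  # number of queries to keep

    # Score each query against the mean key vector (assumed criterion).
    key_mean = K.mean(dim=1, keepdim=True)            # (B, 1, D)
    scores = (Q * key_mean).sum(dim=-1)               # (B, L)
    top_idx = scores.topk(k, dim=-1).indices          # (B, k)

    # Full softmax attention only for the selected queries: O(k*L) scores.
    gather_idx = top_idx.unsqueeze(-1).expand(-1, -1, D)               # (B, k, D)
    Q_sel = torch.gather(Q, 1, gather_idx)                             # (B, k, D)
    attn = torch.softmax(Q_sel @ K.transpose(1, 2) / D ** 0.5, dim=-1)  # (B, k, L)
    out_sel = attn @ V                                                  # (B, k, D)

    # Unselected positions fall back to the mean value vector
    # (a placeholder choice for this illustration).
    out = V.mean(dim=1, keepdim=True).expand(B, L, D).clone()
    out.scatter_(1, gather_idx, out_sel)
    return out


if __name__ == "__main__":
    Q = torch.randn(2, 96, 64)
    K = torch.randn(2, 96, 64)
    V = torch.randn(2, 96, 64)
    print(sparse_attention_query_selection(Q, K, V).shape)  # torch.Size([2, 96, 64])
```

Because full attention scores are computed only for the k selected queries, the dominant cost of the attention layer drops from O(L^2 d) to roughly O(k L d).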


Related research:

01/16/2023 – TDSTF: Transformer-based Diffusion probabilistic model for Sparse Time series Forecasting
Time series probabilistic forecasting with multi-dimensional and sporadi...

02/23/2022 – A Differential Attention Fusion Model Based on Transformer for Time Series Forecasting
Time series forecasting is widely used in the fields of equipment life c...

06/08/2023 – Does Long-Term Series Forecasting Need Complex Attention and Extra Long Inputs?
As Transformer-based models have achieved impressive performance on vari...

12/15/2022 – First De-Trend then Attend: Rethinking Attention for Time-Series Forecasting
Transformer-based models have gained large popularity and demonstrated p...

08/09/2023 – PETformer: Long-term Time Series Forecasting via Placeholder-enhanced Transformer
Recently, Transformer-based models have shown remarkable performance in ...

03/18/2018 – Aggregating Strategies for Long-term Forecasting
The article is devoted to investigating the application of aggregating a...
