An Attention Free Long Short-Term Memory for Time Series Forecasting

09/20/2022
by   Hugo Inzirillo, et al.

Deep learning plays an increasingly important role in time series analysis. We focus on time series forecasting with an attention-free mechanism, a more efficient framework, and propose a new architecture for time series prediction in settings where linear models fail to capture the temporal dependence. The proposed architecture, built from attention-free LSTM layers, outperforms linear models for conditional variance prediction. Our findings confirm the validity of the model, which improves the prediction capacity of an LSTM while making the learning task more efficient.
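The abstract does not spell out the attention-free operation the layers use; a common choice in this line of work is the Attention Free Transformer (AFT) update, in which each output is a sigmoid-gated, position-biased weighted average of the values rather than a quadratic query-key dot product. The sketch below is illustrative only (names `aft_full` and the bias matrix `w` are our own, not from the paper), assuming the AFT-full formulation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def aft_full(Q, K, V, w):
    """AFT-full operation: for each time step t,
    Y_t = sigmoid(Q_t) * sum_s exp(K_s + w[t, s]) * V_s
                        / sum_s exp(K_s + w[t, s])
    Q, K, V: (T, d) projections of the sequence; w: (T, T) learned
    pairwise position biases. No T x T attention matrix per head is
    materialized over the feature dimension, unlike dot-product attention.
    """
    # exp(K_s + w[t, s]) for every (t, s) pair -> shape (T, T, d)
    weights = np.exp(K[None, :, :] + w[:, :, None])
    num = (weights * V[None, :, :]).sum(axis=1)  # weighted sum of values, (T, d)
    den = weights.sum(axis=1)                    # normalizer, (T, d)
    return sigmoid(Q) * num / den                # gated average, (T, d)

# Toy usage: a length-5 sequence with 4 features.
rng = np.random.default_rng(0)
T, d = 5, 4
Q, K, V = (rng.standard_normal((T, d)) for _ in range(3))
w = 0.1 * rng.standard_normal((T, T))
Y = aft_full(Q, K, V, w)
```

In the paper's setting, a block like this would sit alongside LSTM layers, with `Q`, `K`, `V` produced by learned linear maps of the hidden states; the exact wiring is the authors' design and is not reproduced here.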


Related research

11/02/2020
Time Series Forecasting with Stacked Long Short-Term Memory Networks
Long Short-Term Memory (LSTM) networks are often used to capture tempora...

09/10/2019
LSTM-MSNet: Leveraging Forecasts on Sets of Related Time Series with Multiple Seasonal Patterns
Generating forecasts for time series with multiple seasonal cycles is an...

12/18/2018
A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series
While LSTMs show increasingly promising results for forecasting Financia...

03/10/2023
TSMixer: An all-MLP Architecture for Time Series Forecasting
Real-world time-series datasets are often multivariate with complex dyna...

03/30/2020
Difference Attention Based Error Correction LSTM Model for Time Series Prediction
In this paper, we propose a novel model for time series prediction in wh...

04/14/2022
EvoSTS Forecasting: Evolutionary Sparse Time-Series Forecasting
In this work, we highlight our novel evolutionary sparse time-series for...

12/16/2021
A New Model-free Prediction Method: GA-NoVaS
Volatility forecasting plays an important role in the financial economet...
