A Comparison of LSTMs and Attention Mechanisms for Forecasting Financial Time Series

12/18/2018
by Thomas Hollis, et al.

While LSTMs show increasingly promising results for forecasting Financial Time Series (FTS), this paper seeks to assess whether attention mechanisms can further improve performance. The hypothesis is that attention can help address the long-term dependency problems experienced by LSTM models. To test this hypothesis, the main contribution of this paper is the implementation of an LSTM with attention. The benchmark LSTM and the LSTM with attention were compared, and both achieved reasonable performances of up to 60% on Kaggle's Two Sigma dataset. This comparative analysis demonstrates that an LSTM with attention can indeed outperform standalone LSTMs, but further investigation is required, as issues do arise with such model architectures.
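The paper itself does not include code in this abstract, but the idea of an "LSTM with attention" can be sketched: instead of forecasting from the LSTM's final hidden state alone, an attention layer scores every hidden state in the window, softmax-normalises the scores, and forecasts from the resulting weighted context vector. The snippet below is a minimal numpy illustration of that pooling step under assumed shapes (a 30-step window, 8 hidden units, a random scoring vector standing in for learned parameters); it is not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def attention_pool(H, w):
    """Additive-style attention over a sequence of LSTM hidden states.

    H: (T, d) matrix of hidden states, one row per time step.
    w: (d,) scoring vector (learned in practice; random here).
    Returns the context vector (d,) and attention weights (T,).
    """
    scores = H @ w            # (T,) unnormalised relevance score per step
    scores -= scores.max()    # shift for numerical stability
    alpha = np.exp(scores)
    alpha /= alpha.sum()      # softmax: weights are positive and sum to 1
    context = alpha @ H       # weighted sum of hidden states, shape (d,)
    return context, alpha

# Toy stand-in for LSTM outputs on a 30-step FTS window with 8 hidden units.
T, d = 30, 8
H = rng.standard_normal((T, d))
w = rng.standard_normal(d)

context, alpha = attention_pool(H, w)
# A linear head on the context vector would then produce the forecast:
forecast = context @ rng.standard_normal(d)
print(round(alpha.sum(), 6))
```

In a full model, `H` would come from an LSTM layer and `w` (plus the forecast head) would be trained end-to-end; the attention weights `alpha` also give some interpretability, showing which time steps the forecast attends to.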

Related research

- EA-LSTM: Evolutionary Attention-based LSTM for Time Series Prediction (11/09/2018)
  Time series prediction with deep learning methods, especially long short...
- An Attention Free Long Short-Term Memory for Time Series Forecasting (09/20/2022)
  Deep learning is playing an increasingly important role in time series a...
- Deep Learning with Kernel Flow Regularization for Time Series Forecasting (09/23/2021)
  Long Short-Term Memory (LSTM) neural networks have been widely used for ...
- Making Good on LSTMs' Unfulfilled Promise (11/11/2019)
  LSTMs promise much to financial time-series analysis, temporal and cross...
- A Way out of the Odyssey: Analyzing and Combining Recent Insights for LSTMs (11/16/2016)
  LSTMs have become a basic building block for many deep NLP models. In re...
- Application of LSTM architectures for next frame forecasting in Sentinel-1 images time series (09/02/2020)
  Predictive analysis makes it possible to estimate the trends of future event...
