A Dual-Stage Attention-Based Recurrent Neural Network for Time Series Prediction

04/07/2017
by Yao Qin, et al.

The nonlinear autoregressive exogenous (NARX) model, which predicts the current value of a time series from its previous values as well as the current and past values of multiple driving (exogenous) series, has been studied for decades. Although many NARX models have been developed, few can appropriately capture long-term temporal dependencies and select the relevant driving series for prediction. In this paper, we propose a dual-stage attention-based recurrent neural network (DA-RNN) to address these two issues. In the first stage, we introduce an input attention mechanism that adaptively extracts the relevant driving series (i.e., input features) at each time step by referring to the previous encoder hidden state. In the second stage, we use a temporal attention mechanism to select relevant encoder hidden states across all time steps. With this dual-stage attention scheme, our model not only makes effective predictions but is also easy to interpret. Thorough empirical studies on the SML 2010 dataset and the NASDAQ 100 Stock dataset demonstrate that DA-RNN outperforms state-of-the-art methods for time series prediction.
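To make the first of the two attention stages concrete, below is a minimal PyTorch-style sketch of the input attention encoder described in the abstract. It is an illustration, not the authors' code: the class name, tensor shapes, and the single linear scoring layer are assumptions (the paper itself uses an additive tanh-based score over the previous encoder hidden and cell states and each driving series' history). The second stage would apply an analogous softmax attention over the T encoder hidden states during decoding.

```python
import torch
import torch.nn as nn


class InputAttentionEncoder(nn.Module):
    """Sketch of the DA-RNN first stage: input attention over n driving series.

    Assumed shapes: n driving series, window length T, encoder hidden size m.
    """

    def __init__(self, n_series, hidden_size, T):
        super().__init__()
        self.n, self.m, self.T = n_series, hidden_size, T
        self.lstm = nn.LSTMCell(n_series, hidden_size)
        # Simplified score: one linear layer over (h_{t-1}; s_{t-1}; x^k_{1..T}).
        self.attn = nn.Linear(2 * hidden_size + T, 1)

    def forward(self, x):                      # x: (batch, T, n)
        b = x.size(0)
        h = x.new_zeros(b, self.m)             # previous hidden state h_{t-1}
        s = x.new_zeros(b, self.m)             # previous cell state s_{t-1}
        encoded = []
        for t in range(self.T):
            # Score each driving series k against the previous encoder state.
            hs = torch.cat([h, s], dim=1).unsqueeze(1).expand(b, self.n, 2 * self.m)
            e = self.attn(torch.cat([hs, x.permute(0, 2, 1)], dim=2)).squeeze(-1)
            alpha = torch.softmax(e, dim=1)    # input attention weights (batch, n)
            x_tilde = alpha * x[:, t, :]       # adaptively reweighted inputs at step t
            h, s = self.lstm(x_tilde, (h, s))
            encoded.append(h)
        return torch.stack(encoded, dim=1)     # (batch, T, m) encoder hidden states
```

In the full model, the decoder's temporal attention would compute softmax weights over the (batch, T, m) encoder outputs returned here to form a context vector before predicting the target value.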


Related research

04/16/2019 · DSTP-RNN: a dual-stage two-phase attention-based recurrent neural networks for long-term and multivariate time series prediction
Long-term prediction of multivariate time series is still an important b...

06/02/2018 · Hierarchical Attention-Based Recurrent Highway Networks for Time Series Prediction
Time series prediction has been studied in a variety of domains. However...

04/13/2020 · Hybrid Attention Networks for Flow and Pressure Forecasting in Water Distribution Systems
Multivariate geo-sensory time series prediction is challenging because o...

06/17/2018 · Multi-variable LSTM neural network for autoregressive exogenous model
In this paper, we propose multi-variable LSTM capable of accurate foreca...

06/22/2018 · Focusing on What is Relevant: Time-Series Learning and Understanding using Attention
This paper is a contribution towards interpretability of the deep learni...

08/26/2023 · Multivariate time series classification with dual attention network
One of the topics in machine learning that is becoming more and more rel...

09/12/2017 · RRA: Recurrent Residual Attention for Sequence Learning
In this paper, we propose a recurrent neural network (RNN) with residual...
