Position-based Content Attention for Time Series Forecasting with Sequence-to-sequence RNNs

03/29/2017
by Yagmur G. Cinar, et al.

We propose here an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. This extended attention model can be deployed on top of any RNN and is shown to yield state-of-the-art performance for time series forecasting on several univariate and multivariate time series.
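The abstract does not spell out how the attention is extended, but the idea of augmenting content attention with positional information to capture (pseudo-)periods can be illustrated with a short sketch. The PyTorch module below is a hypothetical illustration only, assuming the extension adds a learnable lag-dependent bias to standard additive (Bahdanau-style) content attention; the class name, the lag_bias parameter, and the max_lag argument are placeholders for this sketch, not the authors' formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class PositionContentAttention(nn.Module):
    """Hypothetical sketch: additive (content) attention extended with a
    lag-dependent bias, so the decoder can favour encoder positions that sit
    roughly one (pseudo-)period behind the step being forecast."""

    def __init__(self, hidden_size: int, max_lag: int):
        super().__init__()
        self.content = nn.Linear(2 * hidden_size, hidden_size)  # Bahdanau-style content score
        self.v = nn.Linear(hidden_size, 1, bias=False)
        # One learnable bias per possible lag between the decoder step and an
        # encoder position (an assumption made for this sketch, not taken from the paper).
        self.lag_bias = nn.Parameter(torch.zeros(max_lag))

    def forward(self, dec_state, enc_outputs, dec_step):
        # dec_state: (batch, hidden); enc_outputs: (batch, src_len, hidden); dec_step: int
        src_len = enc_outputs.size(1)
        dec_expanded = dec_state.unsqueeze(1).expand(-1, src_len, -1)
        content_score = self.v(
            torch.tanh(self.content(torch.cat([dec_expanded, enc_outputs], dim=-1)))
        ).squeeze(-1)                                            # (batch, src_len)
        # Lag between the forecast step and each encoder position, clipped to the table size.
        positions = torch.arange(src_len, device=enc_outputs.device)
        lags = (dec_step + src_len - positions).clamp(max=self.lag_bias.numel() - 1)
        score = content_score + self.lag_bias[lags]              # broadcast over the batch
        weights = F.softmax(score, dim=-1)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)  # (batch, hidden)
        return context, weights


# Usage sketch: context vector for the third forecast step of a 24-step input window.
if __name__ == "__main__":
    attn = PositionContentAttention(hidden_size=32, max_lag=48)
    enc = torch.randn(4, 24, 32)
    dec = torch.randn(4, 32)
    context, weights = attn(dec, enc, dec_step=2)
    print(context.shape, weights.shape)  # torch.Size([4, 32]) torch.Size([4, 24])
```

In a sketch like this, a large bias at lags equal to the series period lets the decoder attend to encoder states one period back, which is the intuition behind capturing pseudo-periodic structure.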


Related research

05/09/2018  Foundations of Sequence-to-Sequence Modeling for Time Series
The availability of large amounts of time series data, paired with the p...

09/10/2017  R2N2: Residual Recurrent Neural Networks for Multivariate Time Series Forecasting
Multivariate time-series modeling and forecasting is an important proble...

07/28/2023  A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting
Time series forecasting has received a lot of attention with recurrent n...

06/08/2021  RECOWNs: Probabilistic Circuits for Trustworthy Time Series Forecasting
Time series forecasting is a relevant task that is performed in several ...

07/19/2021  Topological Attention for Time Series Forecasting
The problem of (point) forecasting univariate time series is considered....

05/14/2019  A self-organising eigenspace map for time series clustering
This paper presents a novel time series clustering method, the self-orga...

07/07/2019  Fast ES-RNN: A GPU Implementation of the ES-RNN Algorithm
Due to their prevalence, time series forecasting is crucial in multiple ...
