Position-based Content Attention for Time Series Forecasting with Sequence-to-sequence RNNs

03/29/2017 · Yagmur G. Cinar et al.

We propose an extended attention model for sequence-to-sequence recurrent neural networks (RNNs) designed to capture (pseudo-)periods in time series. This extended attention model can be deployed on top of any RNN and is shown to yield state-of-the-art performance in time series forecasting on several univariate and multivariate time series.
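The abstract does not spell out the exact formulation, but the core idea — augmenting a content-based attention score with a position-based term that favors encoder steps lying a (pseudo-)period behind the current prediction step — can be sketched as follows. This is a minimal illustration, not the paper's model: the additive content score is standard Bahdanau-style attention, and the cosine position prior, the `period` parameter, and the weight `gamma` are assumptions made for the sketch.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_weights(enc_states, dec_state, W, v, positions, t, period, gamma=1.0):
    """Illustrative position-based content attention (not the paper's exact model).

    enc_states: (T, H) encoder hidden states
    dec_state:  (H,) current decoder hidden state
    W: (H, 2H), v: (H,) learned attention parameters
    positions:  (T,) time indices of the encoder steps
    t:          time index of the step being predicted
    period:     assumed (pseudo-)period of the series
    """
    T = len(enc_states)
    # Content score: additive (Bahdanau-style) attention over each encoder state.
    joint = np.concatenate([enc_states, np.tile(dec_state, (T, 1))], axis=1)  # (T, 2H)
    content = v @ np.tanh(W @ joint.T)                                        # (T,)
    # Position score: peaks at encoder steps a whole number of periods behind t,
    # biasing attention toward the corresponding phase of past cycles.
    pos = gamma * np.cos(2 * np.pi * (t - positions) / period)                # (T,)
    return softmax(content + pos)
```

With the content parameters zeroed out, the weights reduce to the periodic prior alone, so the highest weight lands on encoder steps exactly one or more periods behind `t` — which is the behavior the extended attention is meant to encourage.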
