Feed-Forward Networks with Attention Can Solve Some Long-Term Memory Problems

12/29/2015
by Colin Raffel, et al.

We propose a simplified model of attention which is applicable to feed-forward neural networks and demonstrate that the resulting model can solve the synthetic "addition" and "multiplication" long-term memory problems for sequence lengths which are both longer and more widely varying than the best published results for these tasks.
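The simplified attention described here collapses a length-T sequence of per-timestep states into a single fixed-length context vector: each state is scored by a small learnable function, the scores are normalized with a softmax over time, and the states are averaged with those weights. Below is a minimal NumPy sketch of that computation; the tanh scoring layer, the shapes, and the names (feed_forward_attention, W, b) are illustrative assumptions rather than the paper's exact parameterization.

```python
import numpy as np

def feed_forward_attention(h, W, b):
    """Reduce a sequence of states h (shape (T, d)) to one context vector.

    Scores each timestep with a single tanh layer (an assumed form of the
    learnable scoring function), softmaxes the scores over time, and returns
    the attention-weighted average of the states.
    """
    e = np.tanh(h @ W + b).squeeze(-1)       # (T,) unnormalized scores
    alpha = np.exp(e - e.max())
    alpha /= alpha.sum()                     # softmax over the time axis
    c = (alpha[:, None] * h).sum(axis=0)     # (d,) fixed-length context vector
    return c, alpha

# Toy usage: attend over a random sequence of T=50 states with d=16 dimensions.
rng = np.random.default_rng(0)
T, d = 50, 16
h = rng.standard_normal((T, d))
W = rng.standard_normal((d, 1)) * 0.1
b = np.zeros(1)
c, alpha = feed_forward_attention(h, W, b)
print(c.shape, alpha.sum())                  # (16,) 1.0
```

Because the context vector has a fixed size regardless of T, it can be fed to an ordinary feed-forward network, which is what lets the model handle the variable-length addition and multiplication tasks without recurrence.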


Related Research

03/16/2016
Recurrent Dropout without Memory Loss
This paper presents a novel approach to recurrent neural network (RNN) r...

08/05/2020
Working Memory for Online Memory Binding Tasks: A Hybrid Model
Working Memory is the brain module that holds and manipulates informatio...

05/12/2016
Direct Method for Training Feed-forward Neural Networks using Batch Extended Kalman Filter for Multi-Step-Ahead Predictions
This paper is dedicated to the long-term, or multi-step-ahead, time seri...

06/02/2017
Yeah, Right, Uh-Huh: A Deep Learning Backchannel Predictor
Using supporting backchannel (BC) cues can make human-computer interacti...

03/05/2020
Predicting Memory Compiler Performance Outputs using Feed-Forward Neural Networks
Typical semiconductor chips include thousands of mostly small memories. ...

03/14/2015
Dynamic Move Tables and Long Branches with Backtracking in Computer Chess
The idea of dynamic move chains has been described in a preceding paper ...
