A Critical Review of Recurrent Neural Networks for Sequence Learning

05/29/2015
by Zachary C. Lipton, et al.

Countless learning tasks require dealing with sequential data. Image captioning, speech synthesis, and music generation all require that a model produce outputs that are sequences. In other domains, such as time series prediction, video analysis, and musical information retrieval, a model must learn from inputs that are sequences. Interactive tasks, such as translating natural language, engaging in dialogue, and controlling a robot, often demand both capabilities. Recurrent neural networks (RNNs) are connectionist models that capture the dynamics of sequences via cycles in the network of nodes. Unlike standard feedforward neural networks, recurrent networks retain a state that can represent information from an arbitrarily long context window. Although recurrent neural networks have traditionally been difficult to train, and often contain millions of parameters, recent advances in network architectures, optimization techniques, and parallel computation have enabled successful large-scale learning with them. In recent years, systems based on long short-term memory (LSTM) and bidirectional recurrent neural network (BRNN) architectures have demonstrated ground-breaking performance on tasks as varied as image captioning, language translation, and handwriting recognition. In this survey, we review and synthesize the research that over the past three decades first yielded and then made practical these powerful learning models. When appropriate, we reconcile conflicting notation and nomenclature. Our goal is to provide a self-contained explication of the state of the art together with a historical perspective and references to primary research.
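The abstract's core idea, that a recurrent network retains a state carrying information from earlier inputs via cycles in the computation graph, can be sketched with a vanilla (Elman-style) RNN step. This is an illustrative sketch in plain Python, not code from the surveyed paper; the function names and the tiny weight matrices below are hypothetical choices for the example.

```python
import math

def rnn_step(x, h, W_xh, W_hh, b):
    """One step of a vanilla RNN: h' = tanh(W_xh @ x + W_hh @ h + b).

    The new hidden state depends on both the current input x and the
    previous hidden state h -- this feedback loop is the "cycle" that
    lets the network carry context forward through time.
    """
    return [
        math.tanh(
            sum(W_xh[i][j] * x[j] for j in range(len(x)))
            + sum(W_hh[i][j] * h[j] for j in range(len(h)))
            + b[i]
        )
        for i in range(len(h))
    ]

def run_rnn(xs, h0, W_xh, W_hh, b):
    """Unroll the recurrence over an input sequence, returning the
    hidden state after each step. Because each state feeds into the
    next, the final state can (in principle) reflect arbitrarily
    old inputs."""
    h, states = h0, []
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b)
        states.append(h)
    return states
```

In practice the same recurrence is applied with learned weight matrices and trained by backpropagation through time; LSTM cells replace the plain `tanh` update with gated updates to make long-range credit assignment tractable.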


