Learning to Execute

10/17/2014
by Wojciech Zaremba, et al.

Recurrent Neural Networks (RNNs) with Long Short-Term Memory units (LSTM) are widely used because they are expressive and easy to train. Our interest lies in empirically evaluating the expressiveness and the learnability of LSTMs in the sequence-to-sequence regime by training them to evaluate short computer programs, a domain that has traditionally been seen as too complex for neural networks. We consider a simple class of programs that can be evaluated with a single left-to-right pass using constant memory. Our main result is that LSTMs can learn to map the character-level representations of such programs to their correct outputs. Notably, it was necessary to use curriculum learning, and while conventional curriculum learning proved ineffective, we developed a new variant of curriculum learning that improved our networks' performance in all experimental conditions. The improved curriculum had a dramatic impact on an addition problem, making it possible to train an LSTM to add two 9-digit numbers with 99% accuracy.
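The abstract mentions character-level input encoding and a new curriculum variant without giving details. The Python sketch below shows one plausible way to generate character-level training pairs for the addition task and to mix difficulties during training; the input formatting, the 50/50 mixing probability, and all function names are illustrative assumptions, not the authors' actual procedure.

    # Hedged sketch: generating character-level training pairs for the
    # 9-digit addition task described in the abstract, together with a
    # simple "mixed" curriculum sampler. All names, the '.' terminator,
    # and the mixing ratio are assumptions made for illustration.
    import random

    MAX_DIGITS = 9  # the abstract's hardest setting: adding two 9-digit numbers


    def make_addition_example(num_digits: int) -> tuple[str, str]:
        """Return (input, target) as character strings, e.g. '123+45.' -> '168'."""
        a = random.randint(0, 10 ** num_digits - 1)  # operands with up to num_digits digits
        b = random.randint(0, 10 ** num_digits - 1)
        source = f"{a}+{b}."   # '.' marks the end of the input sequence
        target = str(a + b)
        return source, target


    def sample_difficulty(current_level: int, mix_prob: float = 0.5) -> int:
        """Mixed curriculum: with probability mix_prob draw a difficulty uniformly
        from the whole range, otherwise train at the current curriculum level."""
        if random.random() < mix_prob:
            return random.randint(1, MAX_DIGITS)
        return current_level


    if __name__ == "__main__":
        level = 3  # pretend the curriculum has reached 3-digit operands
        for _ in range(5):
            d = sample_difficulty(level)
            src, tgt = make_addition_example(d)
            print(f"{src!r:>24} -> {tgt!r}")

Mixing easy and hard examples, rather than strictly increasing difficulty, is one reading of the "new variant of curriculum learning" the abstract credits with the improvement; the model itself would consume these strings one character at a time in a sequence-to-sequence fashion.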


Related research

09/08/2014  Recurrent Neural Network Regularization
We present a simple regularization technique for Recurrent Neural Networ...

11/19/2015  Neural Programmer-Interpreters
We propose the neural programmer-interpreter (NPI): a recurrent and comp...

11/18/2016  Visualizing and Understanding Curriculum Learning for Long Short-Term Memory Networks
Curriculum Learning emphasizes the order of training instances in a comp...

09/12/2017  Parallelizing Linear Recurrent Neural Nets Over Sequence Length
Recurrent neural networks (RNNs) are widely used to model sequential dat...

12/13/2022  Can recurrent neural networks learn process model structure?
Various methods using machine and deep learning have been proposed to ta...

10/30/2015  Highway Long Short-Term Memory RNNs for Distant Speech Recognition
In this paper, we extend the deep long short-term memory (DLSTM) recurre...

07/14/2023  A Quantitative Approach to Predicting Representational Learning and Performance in Neural Networks
A key property of neural networks (both biological and artificial) is ho...
