Extensions and Limitations of the Neural GPU

11/02/2016
by Eric Price, et al.

The Neural GPU is a recent model that can learn algorithms such as multi-digit binary addition and binary multiplication in a way that generalizes to inputs of arbitrary length. We show that there are two simple ways to improve the performance of the Neural GPU: carefully designing a curriculum, and increasing model size. The latter requires a memory-efficient implementation, as a naive implementation of the Neural GPU is memory-intensive. We find that these techniques enlarge the set of algorithmic problems the Neural GPU can solve: we have been able to learn all the arithmetic operations (and to generalize to arbitrarily long numbers) when the arguments are given in decimal representation, which, surprisingly, had not been possible before. We have also been able to train the Neural GPU to evaluate long arithmetic expressions with multiple operands, which requires respecting operator precedence, although this has succeeded only with binary representations, and not with perfect accuracy. In addition, we gain insight into the Neural GPU by investigating its failure modes. We find that Neural GPUs that generalize correctly to arbitrarily long numbers still fail on highly symmetric, atypical inputs: for example, a Neural GPU that achieves near-perfect generalization on decimal multiplication of numbers up to 100 digits long can fail on 000000...002 × 000000...002 while succeeding at 2 × 2. These failure modes are reminiscent of adversarial examples.
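For context, the Neural GPU's core computation is a stack of convolutional gated recurrent units (CGRUs) applied repeatedly to a state tensor whose length tracks the input; because the same filters are used at every position, a trained model can be run on inputs of any length. Below is a minimal NumPy sketch of one CGRU step, assuming the standard gated formulation; the names, shapes, and width-3 filter bank are illustrative, not the authors' code.

```python
import numpy as np

def conv1d(s, W):
    """Width-3 'same' convolution along the length axis.
    s: (L, m) state; W: (3, m, m) filter bank."""
    m = s.shape[1]
    padded = np.vstack([np.zeros((1, m)), s, np.zeros((1, m))])
    return sum(padded[k:k + len(s)] @ W[k] for k in range(3))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cgru_step(s, params):
    """One gated update of the state: u is the update gate, r the reset gate."""
    Wu, Bu, Wr, Br, W, B = params
    u = sigmoid(conv1d(s, Wu) + Bu)
    r = sigmoid(conv1d(s, Wr) + Br)
    return u * s + (1.0 - u) * np.tanh(conv1d(r * s, W) + B)
```

Stacking a few such layers and iterating them roughly input-length many times yields the full model; since the parameter count is independent of the sequence length, the same weights apply unchanged to longer inputs.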
