
Measuring Arithmetic Extrapolation Performance

by Andreas Madsen, et al.

The Neural Arithmetic Logic Unit (NALU) is a neural network layer that can learn exact arithmetic operations between the elements of a hidden state. The goal of the NALU is to learn perfect extrapolation, which requires learning the exact underlying logic of an unknown arithmetic problem. Evaluating the performance of the NALU is non-trivial, as one arithmetic problem might have many solutions. As a consequence, single-instance MSE has been used to evaluate and compare performance between models. However, it can be hard to interpret what magnitude of MSE represents a correct solution, or to assess a model's sensitivity to initialization. We propose using a success-criterion to measure if and when a model converges. Using a success-criterion, we can summarize the success-rate over many initialization seeds and calculate confidence intervals. We contribute a generalized version of the previous arithmetic benchmark to measure models' sensitivity under different conditions. This is, to our knowledge, the first extensive evaluation of the convergence of the NALU and its sub-units. Using a success-criterion to summarize 4800 experiments, we find that consistently learning arithmetic extrapolation is challenging, in particular for multiplication.
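The evaluation idea in the abstract — declaring a seed "successful" when its extrapolation error falls below a threshold, then reporting the success-rate with a confidence interval — can be sketched as follows. This is a minimal illustration, not the paper's exact protocol: the threshold here is a free parameter (the paper derives it from a near-optimal solution), the function name is mine, and the interval uses the simple normal approximation for a binomial proportion.

```python
import math

def success_rate_with_ci(mse_per_seed, threshold, z=1.96):
    """Summarize per-seed extrapolation MSE as a success-rate with a
    normal-approximation (Wald) confidence interval.

    mse_per_seed: extrapolation-range MSE for each initialization seed.
    threshold:    MSE below this value counts as a "successful" seed
                  (an assumed success-criterion for illustration).
    z:            critical value; 1.96 gives a ~95% interval.
    """
    n = len(mse_per_seed)
    successes = sum(1 for mse in mse_per_seed if mse < threshold)
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    # Clamp to [0, 1] since a proportion cannot leave that range.
    return p, max(0.0, p - half_width), min(1.0, p + half_width)

# Example: 4 seeds, 3 of which converged below the assumed threshold.
rate, lo, hi = success_rate_with_ci([0.1, 0.2, 5.0, 0.05], threshold=1.0)
```

In practice a Wilson or Clopper–Pearson interval is preferable for the small seed counts typical of such sweeps, since the Wald interval degenerates when the success-rate is 0 or 1.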




Related Research

Neural Arithmetic Units

Neural networks can approximate complex functions, but they struggle to ...

iNALU: Improved Neural Arithmetic Logic Unit

Neural networks have to capture mathematical relationships in order to l...

A Primer for Neural Arithmetic Logic Modules

Neural Arithmetic Logic Modules have become a growing area of interest, ...

Neural Arithmetic Logic Units

Neural networks can learn to represent and manipulate numerical informat...

Improving the Robustness of Neural Multiplication Units with Reversible Stochasticity

Multilayer Perceptrons struggle to learn certain simple arithmetic tasks...

Learning Division with Neural Arithmetic Logic Modules

To achieve systematic generalisation, it first makes sense to master sim...

Simulating Problem Difficulty in Arithmetic Cognition Through Dynamic Connectionist Models

The present study aims to investigate similarities between how humans an...

Code Repositories


Code for Neural Arithmetic Units (ICLR) and Measuring Arithmetic Extrapolation Performance (SEDL|NeurIPS)
