
Neural Arithmetic Units

01/14/2020
by Andreas Madsen, et al.

Neural networks can approximate complex functions, but they struggle to perform exact arithmetic operations over real numbers. The lack of inductive bias for arithmetic operations leaves neural networks without the underlying logic necessary to extrapolate on tasks such as addition, subtraction, and multiplication. We present two new neural network components: the Neural Addition Unit (NAU), which can learn exact addition and subtraction, and the Neural Multiplication Unit (NMU), which can multiply subsets of a vector. The NMU is, to our knowledge, the first arithmetic neural network component that can learn to multiply elements of a vector even when the hidden size is large. The two new components draw inspiration from a theoretical analysis of recently proposed arithmetic components. We find that careful initialization, restricting the parameter space, and regularizing for sparsity are important when optimizing the NAU and NMU. Compared with previous neural units, our proposed NAU and NMU converge more consistently, have fewer parameters, learn faster, can converge for larger hidden sizes, obtain sparse and meaningful weights, and can extrapolate to negative and small values.
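As a rough illustration of the formulations the abstract describes, the sketch below implements the two units as plain NumPy functions: the NAU is a linear layer whose weights are constrained to [-1, 1], and the NMU computes each output as a product over inputs gated toward the multiplicative identity. The function names and the clipping-based constraint are illustrative assumptions; the paper trains these constraints with initialization and sparsity regularization rather than a hard clip at inference.

```python
import numpy as np

def nau(z, W):
    """Neural Addition Unit (sketch): a linear layer with weights
    constrained to [-1, 1], so each output is a signed sum of inputs."""
    W = np.clip(W, -1.0, 1.0)
    return z @ W

def nmu(z, W):
    """Neural Multiplication Unit (sketch): each output j is
    prod_i (W[i, j] * z[i] + 1 - W[i, j]), with W constrained to [0, 1].
    W[i, j] = 1 includes z[i] in the product; W[i, j] = 0 gates it to 1."""
    W = np.clip(W, 0.0, 1.0)
    return np.prod(W * z[:, None] + 1.0 - W, axis=0)

# With binary weights the NMU computes an exact product of a subset:
z = np.array([2.0, 3.0, 5.0])
W = np.array([[1.0], [1.0], [0.0]])  # select z[0] and z[1], ignore z[2]
nmu(z, W)  # → array([6.])
```

The gating term `1 - W[i, j]` is what lets the NMU ignore irrelevant inputs: an excluded element contributes a factor of exactly 1 instead of 0, which is why the unit can remain exact as the hidden size grows.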

Related Research

iNALU: Improved Neural Arithmetic Logic Unit (03/17/2020)
Neural networks have to capture mathematical relationships in order to l...

Neural Power Units (06/02/2020)
Conventional Neural Networks can approximate simple arithmetic operation...

Measuring Arithmetic Extrapolation Performance (10/04/2019)
The Neural Arithmetic Logic Unit (NALU) is a neural network layer that c...

Improving the Robustness of Neural Multiplication Units with Reversible Stochasticity (11/10/2022)
Multilayer Perceptrons struggle to learn certain simple arithmetic tasks...

Neural Status Registers (04/15/2020)
Neural networks excel at approximating functions and finding patterns in...

Building Sparse Deep Feedforward Networks using Tree Receptive Fields (03/14/2018)
Sparse connectivity is an important factor behind the success of convolu...

A Primer for Neural Arithmetic Logic Modules (01/23/2021)
Neural Arithmetic Logic Modules have become a growing area of interest, ...

Code Repositories

stable-nalu: Code for Neural Arithmetic Units (ICLR) and Measuring Arithmetic Extrapolation Performance (SEDL|NeurIPS)