
A Primer for Neural Arithmetic Logic Modules

by   Bhumika Mistry, et al.

Neural Arithmetic Logic Modules have become a growing area of interest, though they remain a niche field. These units are small neural networks which aim to achieve systematic generalisation in learning arithmetic operations such as +, -, *, and ÷, while also being interpretable in their weights. This paper is the first to discuss the current state of progress of this field, explaining key works, starting with the Neural Arithmetic Logic Unit (NALU). Focusing on the shortcomings of the NALU, we provide an in-depth analysis to reason about the design choices of recent units. We cross-compare units on their experimental setups and findings, highlighting inconsistencies in a fundamental experiment that prevent direct comparison across papers. We finish by providing a novel discussion of existing applications of the NALU and research directions requiring further exploration.
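To give a flavour of the kind of unit the paper surveys, the following is a minimal, stdlib-only sketch of a NALU-style forward pass for a single output: the weights are constrained toward {-1, 0, 1} via tanh·sigmoid (which is what makes them interpretable), an additive path handles + and -, a log-space multiplicative path handles * and /, and a learned gate mixes the two. The function and parameter names are illustrative, not taken from any specific implementation.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def nalu_forward(x, w_hat, m_hat, g_hat, eps=1e-7):
    # Constrain each effective weight toward {-1, 0, 1} so a trained
    # unit's weights can be read off as the arithmetic operation learned.
    w = [math.tanh(wh) * sigmoid(mh) for wh, mh in zip(w_hat, m_hat)]
    # Additive path: covers + and - directly.
    a = sum(wi * xi for wi, xi in zip(w, x))
    # Multiplicative path: * and / computed as addition in log space.
    m = math.exp(sum(wi * math.log(abs(xi) + eps) for wi, xi in zip(w, x)))
    # Gate chooses (softly) between the additive and multiplicative paths.
    g = sigmoid(sum(gi * xi for gi, xi in zip(g_hat, x)))
    return g * a + (1.0 - g) * m

# Hand-set parameters that approximately compute x0 + x1:
big = 10.0  # saturates tanh/sigmoid so each effective weight is ~1
print(nalu_forward([2.0, 3.0], [big, big], [big, big], [big, big]))  # ~5.0
```

Note the sketch also hints at a NALU shortcoming the paper analyses: the log-of-absolute-value trick discards sign information on the multiplicative path, one of the issues later units set out to fix.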



Neural Power Units

Conventional Neural Networks can approximate simple arithmetic operation...

iNALU: Improved Neural Arithmetic Logic Unit

Neural networks have to capture mathematical relationships in order to l...

Learning Division with Neural Arithmetic Logic Modules

To achieve systematic generalisation, it first makes sense to master sim...

Measuring Arithmetic Extrapolation Performance

The Neural Arithmetic Logic Unit (NALU) is a neural network layer that c...

Neural Arithmetic Units

Neural networks can approximate complex functions, but they struggle to ...

Applicability of Partial Ternary Full Adder in Ternary Arithmetic Units

This paper explores whether a complete ternary full adder, whose ...

Improving the Robustness of Neural Multiplication Units with Reversible Stochasticity

Multilayer Perceptrons struggle to learn certain simple arithmetic tasks...