
Neural Power Units

by Niklas Heim et al.

Conventional neural networks can approximate simple arithmetic operations but fail to generalize beyond the range of numbers seen during training. Neural arithmetic units aim to overcome this difficulty, but current units are either limited to operating on positive numbers or can only represent a subset of arithmetic operations. We introduce the Neural Power Unit (NPU), which operates on the full domain of real numbers and can learn arbitrary power functions in a single layer. The NPU thus fixes the shortcomings of existing arithmetic units and extends their expressivity. We achieve this by using complex arithmetic without requiring a conversion of the network to complex numbers. A simplification of the unit to the RealNPU yields a highly interpretable model. We show that NPUs outperform their competitors in accuracy and sparsity on artificial arithmetic datasets, and that the RealNPU can discover the governing equations of a dynamical system purely from data.
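The complex-arithmetic trick mentioned in the abstract can be sketched as follows. This is a hypothetical NumPy illustration, not the authors' implementation: it assumes the unit takes the complex logarithm of a real input, log(x) = log|x| + iπ·[x < 0], applies a real weight matrix in log-space, and exponentiates back, keeping only the real part. Under that assumption, a single layer can represent products and arbitrary powers, including of negative inputs, without any complex-valued weights.

```python
import numpy as np

def real_npu(x, W, eps=1e-8):
    """Illustrative forward pass of a RealNPU-style unit (real weights only).

    Works in log-space: for real x, the complex log is
    log(x) = log|x| + i*pi*[x < 0], so a linear map W applied to
    log(x), followed by exp, yields products of powers of the inputs.
    Only the real part of the result is returned.
    """
    r = np.log(np.maximum(np.abs(x), eps))  # real part of log(x); eps avoids log(0)
    k = np.pi * (x < 0)                     # imaginary part of log(x)
    # Real part of exp(W @ log(x)) for a real weight matrix W:
    return np.exp(W @ r) * np.cos(W @ k)

# Example: with W = [[1, 2]] the unit computes x1 * x2^2,
# correctly handling the negative input.
x = np.array([-2.0, 3.0])
W = np.array([[1.0, 2.0]])
y = real_npu(x, W)  # -2 * 3^2 = -18
```

Note how the cosine factor supplies the sign: a negative input contributes π to the phase, so odd integer powers flip the sign while even powers do not, which is exactly what lets the unit extend beyond positive numbers.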

