Recognizing and Verifying Mathematical Equations using Multiplicative Differential Neural Units

by Ankur Mali, et al.

Automated mathematical reasoning is a challenging problem that requires an agent to learn algebraic patterns that contain long-range dependencies. Two particular tasks that test this type of reasoning are (1) mathematical equation verification, which requires determining whether trigonometric and linear algebraic statements are valid identities or not, and (2) equation completion, which entails filling in a blank within an expression to make it true. Solving these tasks with deep learning requires that the neural model learn how to manipulate and compose various algebraic symbols, carrying this ability over to previously unseen expressions. Artificial neural networks, including recurrent networks and transformers, struggle to generalize on these kinds of difficult compositional problems, often exhibiting poor extrapolation performance. In contrast, recursive neural networks (recursive-NNs) are, theoretically, capable of achieving better extrapolation due to their tree-like design, but are difficult to optimize as the depth of their underlying tree structure increases. To overcome this issue, we extend recursive-NNs to utilize multiplicative, higher-order synaptic connections and, furthermore, to learn to dynamically control and manipulate an external memory. We argue that this key modification gives the neural system the ability to capture powerful transition functions for each possible input. We demonstrate the effectiveness of our proposed higher-order, memory-augmented recursive-NN models on two challenging mathematical equation tasks, showing improved extrapolation, stable performance, and faster convergence. Our models achieve a 1.53% improvement over current state-of-the-art methods in equation verification and a 2.22% improvement in equation completion.
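The multiplicative, higher-order connections mentioned above can be illustrated with a small sketch. This is not the authors' implementation; it is a minimal NumPy example (with assumed, illustrative dimensions and random weights) of how a second-order recursive unit composes two child node embeddings through a bilinear tensor term, in addition to the purely additive term used by a standard recursive-NN cell:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimensionality (illustrative choice, not from the paper)

# Bilinear tensor T: one d x d interaction matrix per output unit,
# plus an ordinary first-order weight matrix W and bias b.
T = rng.normal(scale=0.1, size=(d, d, d))
W = rng.normal(scale=0.1, size=(d, 2 * d))
b = np.zeros(d)

def compose(left, right):
    """Combine two child embeddings into a parent embedding.

    The multiplicative term lets every pair of child units interact,
    which is the 'higher-order' connectivity a first-order cell lacks.
    """
    multiplicative = np.einsum('i,kij,j->k', left, T, right)  # bilinear term
    additive = W @ np.concatenate([left, right])              # first-order term
    return np.tanh(multiplicative + additive + b)

# Evaluate a tiny expression tree bottom-up, e.g. compose((x, y)) then z:
x, y, z = (rng.normal(size=d) for _ in range(3))
root = compose(compose(x, y), z)
print(root.shape)  # -> (8,)
```

In the paper's setting the tree structure would come from the parse of an algebraic expression, and the unit would additionally read from and write to an external memory; this sketch shows only the multiplicative composition step.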




