Pretrained Language Models are Symbolic Mathematics Solvers too!

10/07/2021
by Modar Sulaiman, et al.

Solving symbolic mathematics has long been within the arena of human ingenuity, requiring compositional reasoning and recurrence. However, recent studies have shown that large-scale language models such as transformers are surprisingly general and can be trained on a sequence-to-sequence task to solve complex mathematical equations. These large transformer models need enormous amounts of training data to generalize to unseen symbolic mathematics problems. In this paper, we present a sample-efficient way of solving symbolic tasks: we first pretrain a transformer model on language translation and then fine-tune it on the downstream task of symbolic mathematics. With our pretrained model we achieve accuracy on the integration task comparable to the state of the art in deep learning for symbolic mathematics, while using roughly 1.5 orders of magnitude fewer training samples. Test accuracy on differential-equation tasks is considerably lower than on integration, as these tasks require higher-order recursions that are not present in language translation. We pretrain our model on different language-translation pairs, and our results reveal a language bias in solving symbolic mathematics tasks. Finally, we study the robustness of the fine-tuned model against distribution shift on symbolic math tasks and find that our approach generalizes better under distribution shift for function integration.
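The recipe the abstract describes — treat symbolic math as sequence-to-sequence translation over prefix-notation expressions, starting from a translation-pretrained transformer — can be sketched as below. This is a minimal illustration in PyTorch (an assumed dependency, not named in the abstract); the toy vocabulary, tokenization, and model sizes are all illustrative assumptions, and a real run would load translation-pretrained weights instead of random initialization.

```python
import torch
import torch.nn as nn

# Hypothetical toy vocabulary for prefix-notation math expressions.
VOCAB = ["<pad>", "<sos>", "<eos>", "x", "2", "add", "mul", "pow", "int"]
TOK = {t: i for i, t in enumerate(VOCAB)}

def encode(tokens):
    """Wrap a token list with start/end markers and map to id tensor."""
    return torch.tensor([[TOK["<sos>"]] + [TOK[t] for t in tokens] + [TOK["<eos>"]]])

class Seq2SeqMath(nn.Module):
    def __init__(self, vocab_size, d_model=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=1, num_decoder_layers=1,
            dim_feedforward=64, batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src, tgt):
        # Causal mask so the decoder cannot peek at future target tokens.
        mask = nn.Transformer.generate_square_subsequent_mask(tgt.size(1))
        h = self.transformer(self.embed(src), self.embed(tgt), tgt_mask=mask)
        return self.out(h)

# In the paper's setting these weights would come from a model pretrained
# on language translation; random init here just shows the loop shape.
model = Seq2SeqMath(len(VOCAB))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy integration pair: int(2*x) -> x^2, in prefix notation.
src = encode(["int", "mul", "2", "x"])
tgt = encode(["pow", "x", "2"])

logits = model(src, tgt[:, :-1])  # teacher forcing: shift target by one
loss = nn.functional.cross_entropy(
    logits.reshape(-1, len(VOCAB)), tgt[:, 1:].reshape(-1))
loss.backward()
opt.step()
```

One fine-tuning step on a single (integrand, antiderivative) pair; in practice the same loop runs over a large generated dataset, with the pretraining stage supplying the initial weights.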


