Deep Learning for Symbolic Mathematics

12/02/2019
by Guillaume Lample and François Charton

Neural networks have a reputation for being better at solving statistical or approximate problems than at performing calculations or working with symbolic data. In this paper, we show that they can be surprisingly good at more elaborate mathematical tasks, such as symbolic integration and solving differential equations. We propose a syntax for representing mathematical problems, and methods for generating large datasets that can be used to train sequence-to-sequence models. We achieve results that outperform commercial Computer Algebra Systems such as Matlab or Mathematica.
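The core of the approach is to turn expressions into token sequences that a standard sequence-to-sequence model can read. The sketch below is a minimal illustration of that idea using SymPy, not the authors' released code: the function expr_to_prefix and its token vocabulary are assumptions chosen for clarity.

# Minimal sketch: flatten an expression tree into prefix-notation tokens
# so that a sequence-to-sequence model can consume it as plain text.
# Names and token choices are illustrative, not the paper's exact scheme.
import sympy as sp

def expr_to_prefix(expr):
    """Recursively turn a SymPy expression tree into a list of prefix tokens."""
    if expr.is_Atom:                       # symbols and numbers become single tokens
        return [str(expr)]
    op = type(expr).__name__.lower()       # e.g. 'add', 'mul', 'pow', 'sin'
    tokens = [op]                          # operator first (prefix / Polish notation)
    for arg in expr.args:                  # then its arguments, left to right
        tokens.extend(expr_to_prefix(arg))
    return tokens

x = sp.Symbol('x')
f = x**2 + sp.sin(x)
print(expr_to_prefix(f))                   # e.g. ['add', 'pow', 'x', '2', 'sin', 'x']
print(expr_to_prefix(sp.integrate(f, x)))  # prefix tokens of x**3/3 - cos(x)

Pairing the tokens of an expression with the tokens of its integral (or of a differential equation with its solution) yields the kind of large (input, output) dataset the abstract describes for training sequence-to-sequence models.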

research · 10/07/2021
Pretrained Language Models are Symbolic Mathematics Solvers too!
Solving symbolic mathematics has always been in the arena of human in...

research · 11/18/2021
Method for representing an exponent in a fifth-dimensional hypercomplex number system using hypercomplex computing software
The structure of method for constructing a representation of an exponent...

research · 12/12/2019
The Use of Deep Learning for Symbolic Integration: A Review of (Lample and Charton, 2019)
Lample and Charton (2019) describe a system that uses deep learning tech...

research · 12/03/2021
Linear algebra with transformers
Most applications of transformers to mathematics, from integration to th...

research · 09/28/2021
Symbolic Brittleness in Sequence Models: on Systematic Generalization in Symbolic Mathematics
Neural sequence models trained with maximum likelihood estimation have l...

research · 06/01/2015
Blocks and Fuel: Frameworks for deep learning
We introduce two Python frameworks to train neural networks on large dat...

research · 08/14/2022
Limits of an AI program for solving college math problems
Drori et al. (2022) report that "A neural network solves, explains, and ...
