Solving Arithmetic Word Problems with Transformers and Preprocessing of Problem Text

06/02/2021
by Kaden Griffith, et al.

This paper outlines the use of Transformer networks trained to translate math word problems into equivalent arithmetic expressions in infix, prefix, and postfix notations. We compare results produced by many neural configurations and find that most outperform previously reported approaches on three of four datasets, with significant accuracy gains of over 20 percentage points. The best neural approaches boost accuracy by 30 percentage points over the previous state-of-the-art on some datasets.
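The three target notations differ only in where each operator sits relative to its operands; prefix and postfix need no parentheses, which changes the length and structure of the sequence the Transformer must generate. As a minimal sketch (not the paper's code; the expression tree and function names here are hypothetical, for illustration only), here is how one arithmetic expression renders in each notation:

```python
# Minimal sketch: render one expression tree in the three notations the
# paper uses as translation targets. Names (Op, to_infix, etc.) are
# illustrative, not from the paper's implementation.
from dataclasses import dataclass
from typing import Union

@dataclass
class Op:
    op: str
    left: "Expr"
    right: "Expr"

Expr = Union[Op, int]

def to_infix(e: Expr) -> str:
    # Operator between operands; parentheses make precedence explicit.
    if isinstance(e, Op):
        return f"( {to_infix(e.left)} {e.op} {to_infix(e.right)} )"
    return str(e)

def to_prefix(e: Expr) -> str:
    # Operator before operands; no parentheses needed.
    if isinstance(e, Op):
        return f"{e.op} {to_prefix(e.left)} {to_prefix(e.right)}"
    return str(e)

def to_postfix(e: Expr) -> str:
    # Operator after operands; no parentheses needed.
    if isinstance(e, Op):
        return f"{to_postfix(e.left)} {to_postfix(e.right)} {e.op}"
    return str(e)

# Expression tree for (5 + 3) * 2:
tree = Op("*", Op("+", 5, 3), 2)
print(to_infix(tree))    # ( ( 5 + 3 ) * 2 )
print(to_prefix(tree))   # * + 5 3 2
print(to_postfix(tree))  # 5 3 + 2 *
```

Any of the three strings can serve as the decoder's target sequence; the model's output is then parsed back into a value to check answer accuracy against the dataset's solution.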

