
Giving BERT a Calculator: Finding Operations and Arguments with Reading Comprehension

by   Daniel Andor, et al.

Reading comprehension models have been successfully applied to extractive text answers, but it is unclear how best to generalize these models to abstractive numerical answers. We enable a BERT-based reading comprehension model to perform lightweight numerical reasoning. We augment the model with a predefined set of executable 'programs' which encompass simple arithmetic as well as extraction. Rather than having to learn to manipulate numbers directly, the model can pick a program and execute it. On the recent Discrete Reasoning Over Passages (DROP) dataset, designed to challenge reading comprehension models, we show a 33% absolute improvement by adding shallow programs. The model can learn to predict new operations when appropriate in a math word problem setting (Roy and Roth, 2015) with very few training examples.
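The core idea can be illustrated with a minimal sketch (not the authors' code; the program names and the `execute` helper are hypothetical): rather than generating a numeric answer token by token, the model predicts one of a small predefined set of executable programs plus its arguments, and the answer is produced by running that program.

```python
from typing import Callable, Dict, List, Union

Value = Union[float, int, str]

# Hypothetical program inventory: simple arithmetic plus span extraction,
# mirroring the paper's description of "programs" the model can select.
PROGRAMS: Dict[str, Callable[..., Value]] = {
    "add": lambda a, b: a + b,
    "diff": lambda a, b: a - b,
    "span": lambda text: text,  # extractive answer: return the span itself
}

def execute(op: str, args: List[Value]) -> Value:
    """Run the predicted program on the predicted arguments."""
    return PROGRAMS[op](*args)

# Suppose the reading-comprehension model predicted the operation "diff"
# and located the numbers 2347 and 1905 in the passage:
print(execute("diff", [2347, 1905]))   # 442
print(execute("span", ["Daniel Andor"]))
```

Because execution is exact, the model never has to learn arithmetic itself; it only has to learn which program and which arguments fit the question.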



