Answering Science Exam Questions Using Query Rewriting with Background Knowledge

09/15/2018
by Ryan Musa, et al.

Open-domain question answering (QA) is an important problem in AI and NLP that is emerging as a bellwether for progress on the generalizability of AI methods and techniques. Much of the progress in open-domain QA systems has been realized through advances in information retrieval methods and corpus construction. In this paper, we focus on the recently introduced ARC Challenge dataset, which contains 2,590 multiple-choice questions authored for grade-school science exams. These questions were selected to be the most challenging for current QA systems, and current state-of-the-art performance is only slightly better than random chance. We present a system that rewrites a given question into queries used to retrieve supporting text from a large corpus of science-related text. Our rewriter incorporates background knowledge from ConceptNet and, in tandem with a generic textual entailment system trained on SciTail that identifies support in the retrieved results, applies a generalizable decision methodology over the retrieved evidence and answer candidates to select the best answer. Despite being trained only to identify essential terms in the original source question, this combination of query rewriting, background knowledge, and textual entailment outperforms several strong baselines on the end-to-end ARC QA task.
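The pipeline the abstract describes can be sketched roughly as follows. This is a minimal, illustrative mock-up, not the authors' implementation: the function names, the word-overlap retriever (standing in for a real IR engine), and the overlap-based entailment scorer (standing in for a SciTail-trained model) are all assumptions made for the example.

```python
import re

def tokenize(text):
    """Lowercase and split text into a set of word tokens."""
    return set(re.findall(r"[a-z]+", text.lower()))

def rewrite_query(essential_terms, background_terms):
    """Rewrite the question as a query: its essential terms plus
    background-knowledge terms (e.g. related ConceptNet concepts)."""
    return essential_terms + background_terms

def retrieve(corpus, query_terms, top_k=3):
    """Rank corpus sentences by term overlap with the query
    (a toy stand-in for a real retrieval engine such as BM25)."""
    ranked = sorted(corpus,
                    key=lambda s: len(tokenize(s) & set(query_terms)),
                    reverse=True)
    return ranked[:top_k]

def entailment_score(premise, hypothesis):
    """Toy stand-in for an entailment model: fraction of hypothesis
    terms covered by the premise."""
    h = tokenize(hypothesis)
    return len(tokenize(premise) & h) / max(len(h), 1)

def answer_question(question, choices, corpus, essential_terms, background):
    """Select the choice whose best retrieved sentence most strongly
    supports the question+choice hypothesis."""
    best_choice, best_score = None, -1.0
    for choice in choices:
        query = rewrite_query(essential_terms, background.get(choice, []))
        hypothesis = f"{question} {choice}"
        for premise in retrieve(corpus, query):
            score = entailment_score(premise, hypothesis)
            if score > best_score:
                best_choice, best_score = choice, score
    return best_choice
```

On a toy two-sentence corpus, `answer_question("What process do plants use to convert sunlight into energy", ["photosynthesis", "respiration"], ...)` selects "photosynthesis" because the retrieved photosynthesis sentence covers more of that hypothesis.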


