Reasoning-Driven Question-Answering for Natural Language Understanding

08/14/2019
by Daniel Khashabi, et al.

Natural language understanding (NLU) of text is a fundamental challenge in AI, and it has received significant attention throughout the history of NLP research. This goal has been studied under different tasks, such as Question Answering (QA) and Textual Entailment (TE). In this thesis, we investigate the NLU problem through the QA task and focus on the aspects that make it challenging for current state-of-the-art technology. The thesis is organized into three main parts. In the first part, we explore multiple formalisms for improving existing machine comprehension systems. We propose a formulation for abductive reasoning in natural language and show its effectiveness, especially in domains with limited training data. Additionally, to help reasoning systems cope with irrelevant or redundant information, we create a supervised approach to learning and detecting the essential terms in questions. In the second part, we propose two new challenge datasets of natural language questions: the first requires reasoning over multiple sentences, and the second requires temporal common-sense reasoning. We hope these datasets will motivate the field to address more complex problems. In the final part, we present the first formal framework for multi-step reasoning algorithms in the presence of a few important properties of language use, such as incompleteness and ambiguity. We apply this framework to prove fundamental limitations of reasoning algorithms. These theoretical results provide additional intuition for the existing empirical evidence in the field.
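
To make the essential-term idea concrete, the sketch below trains a simple supervised classifier that scores each question term for essentiality. This is a hypothetical illustration only, not the system described in the thesis: the features, toy labels, and choice of a scikit-learn logistic-regression model are all assumptions made for the example.

# Minimal, hypothetical sketch of supervised essential-term detection.
# Features, toy data, and model choice are illustrative assumptions,
# not the approach described in the thesis.
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression

STOPWORDS = {"the", "a", "an", "of", "is", "do", "what", "which", "to", "in", "during", "through"}

def term_features(term, position, n_terms):
    # Simple lexical and positional cues for a single question term.
    return {
        "is_stopword": term.lower() in STOPWORDS,
        "is_capitalized": term[:1].isupper(),
        "length": len(term),
        "relative_position": position / max(n_terms - 1, 1),
    }

# Toy training questions with per-term essentiality labels (made up for illustration).
train = [
    ("Which organ pumps blood through the body", [0, 1, 1, 1, 0, 0, 1]),
    ("What gas do plants absorb during photosynthesis", [0, 1, 0, 1, 1, 0, 1]),
]

X, y = [], []
for question, labels in train:
    terms = question.split()
    for i, (term, label) in enumerate(zip(terms, labels)):
        X.append(term_features(term, i, len(terms)))
        y.append(label)

vec = DictVectorizer()
clf = LogisticRegression().fit(vec.fit_transform(X), y)

# Rank the terms of a new question by predicted essentiality.
query = "Which planet is closest to the sun".split()
features = [term_features(t, i, len(query)) for i, t in enumerate(query)]
scores = clf.predict_proba(vec.transform(features))[:, 1]
print(sorted(zip(query, scores), key=lambda pair: -pair[1]))

In practice, a reasoning system could use such scores to down-weight or drop non-essential terms before retrieving evidence, which is the kind of robustness to irrelevant or redundant information the abstract refers to.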

Related research

10/17/2022
ReasonChainQA: Text-based Complex Question Answering with Explainable Evidence Chains
The ability of reasoning over evidence has received increasing attention...

06/14/2019
IITP at MEDIQA 2019: Systems Report for Natural Language Inference, Question Entailment and Question Answering
This paper presents the experiments accomplished as a part of our partic...

01/08/2019
On the Capabilities and Limitations of Reasoning for Natural Language Understanding
Recent systems for natural language understanding are strong at overcomi...

11/07/2018
Compositional Language Understanding with Text-based Relational Reasoning
Neural networks for natural language reasoning have largely focused on e...

05/17/2021
Factoring Statutory Reasoning as Language Understanding Challenges
Statutory reasoning is the task of determining whether a legal statute, ...

05/11/2020
A Dataset for Statutory Reasoning in Tax Law Entailment and Question Answering
Legislation can be viewed as a body of prescriptive rules expressed in n...

05/06/2021
A Generative Symbolic Model for More General Natural Language Understanding and Reasoning
We present a new fully-symbolic Bayesian model of semantic parsing and r...
