Probabilistic Assumptions Matter: Improved Models for Distantly-Supervised Document-Level Question Answering

05/05/2020
by   Hao Cheng, et al.

We address the problem of extractive question answering using document-level distant supervision, pairing questions and relevant documents with answer strings. We compare previously used probability space and distant supervision assumptions (assumptions on the correspondence between the weak answer string labels and possible answer mention spans). We show that these assumptions interact, and that different configurations provide complementary benefits. We demonstrate that a multi-objective model can efficiently combine the advantages of multiple assumptions and outperform the best individual formulation. Our approach outperforms previous state-of-the-art models by 4.3 points in F1 on TriviaQA-Wiki and 1.7 points in Rouge-L on NarrativeQA summaries.
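To make the idea of interacting distant-supervision assumptions concrete, here is a minimal sketch in Python. It contrasts two commonly used objectives for weakly labeled answer spans — marginalizing probability over all spans that match the answer string (maximum marginal likelihood) versus crediting only the single best matching span (a hard-EM-style objective) — and a simple weighted multi-objective combination of the two. The span scores, the specific objectives, and the mixing weight `alpha` are illustrative assumptions for exposition, not the paper's exact formulation.

```python
import math

# Hypothetical log-scores for candidate answer spans in a document.
# Spans flagged True match the distantly-supervised answer string.
# These numbers are made up for illustration.
span_scores = [(-1.2, True), (-0.5, False), (-2.0, True), (-3.1, False)]

def log_softmax(scores):
    """Normalize raw scores into log-probabilities over all candidate spans."""
    m = max(scores)
    z = m + math.log(sum(math.exp(s - m) for s in scores))
    return [s - z for s in scores]

def _matched_log_probs(spans):
    log_probs = log_softmax([s for s, _ in spans])
    return [lp for lp, (_, ok) in zip(log_probs, spans) if ok]

def mml_loss(spans):
    """Maximum marginal likelihood: sum probability over all matching spans."""
    matched = _matched_log_probs(spans)
    m = max(matched)
    return -(m + math.log(sum(math.exp(lp - m) for lp in matched)))

def hard_em_loss(spans):
    """Hard-EM style: credit only the single highest-probability matching span."""
    return -max(_matched_log_probs(spans))

def multi_objective_loss(spans, alpha=0.5):
    """Weighted combination of the two distant-supervision objectives
    (an assumed, simplified stand-in for a multi-objective model)."""
    return alpha * mml_loss(spans) + (1 - alpha) * hard_em_loss(spans)
```

Because marginalizing sums probability over every matching span while hard-EM keeps only the largest term, the MML loss is never greater than the hard-EM loss on the same inputs; the combined objective interpolates between the two behaviors.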


