Unsupervised Question Answering by Cloze Translation

06/12/2019
by Patrick Lewis, et al.

Obtaining training data for Question Answering (QA) is time-consuming and resource-intensive, and existing QA datasets are only available for limited domains and languages. In this work, we explore to what extent high-quality training data is actually required for Extractive QA, and investigate the possibility of unsupervised Extractive QA. We approach this problem by first learning to generate context, question and answer triples in an unsupervised manner, which we then use to synthesize Extractive QA training data automatically. To generate such triples, we first sample random context paragraphs from a large corpus of documents and then sample random noun phrases or named entity mentions from these paragraphs as answers. Next, we convert the answers in context to "fill-in-the-blank" cloze questions and finally translate them into natural questions. We propose and compare various unsupervised ways to perform cloze-to-natural question translation, including training an unsupervised NMT model on non-aligned corpora of natural questions and cloze questions, as well as a rule-based approach. We find that modern QA models can learn to answer human questions surprisingly well using only synthetic training data. We demonstrate that, without using the SQuAD training data at all, our approach achieves 56.4 F1 on SQuAD v1 (64.5 F1 when the answer is a named entity mention), outperforming early supervised models.
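A minimal sketch of the synthetic-data pipeline described above, assuming spaCy (en_core_web_sm) for noun-phrase and named-entity extraction: sample an answer span from a context paragraph, blank it out to form a cloze question, and apply a simple wh-word rule to turn the cloze into a natural-sounding question. The wh-word mapping and the cloze-to-question rule are illustrative simplifications rather than the paper's exact rules, and the unsupervised NMT alternative is not shown.

```python
# Sketch: generate (context, question, answer) triples without supervision.
# spaCy is assumed for NP/NE extraction; the rules below are simplified.
import random
import spacy

nlp = spacy.load("en_core_web_sm")

# Rough mapping from spaCy entity labels to wh-words (illustrative only).
WH_BY_ENT_TYPE = {
    "PERSON": "Who",
    "GPE": "Where",
    "LOC": "Where",
    "DATE": "When",
    "TIME": "When",
    "CARDINAL": "How many",
    "MONEY": "How much",
}

def make_cloze(context: str):
    """Pick a random named entity (or noun chunk) as the answer and blank it out."""
    doc = nlp(context)
    ents = list(doc.ents)
    candidates = ents or list(doc.noun_chunks)
    if not candidates:
        return None
    answer = random.choice(candidates)
    sentence = answer.sent
    # Replace the answer span inside its sentence with a blank.
    cloze = (sentence.text[: answer.start_char - sentence.start_char]
             + "___"
             + sentence.text[answer.end_char - sentence.start_char:])
    label = answer.label_ if ents else ""
    return cloze, answer.text, label

def cloze_to_question(cloze: str, ent_label: str) -> str:
    """Very coarse rule-based translation: prepend a wh-word chosen from the
    answer's entity type and drop the blank (a simplified stand-in for the
    paper's rule-based cloze translation)."""
    wh = WH_BY_ENT_TYPE.get(ent_label, "What")
    body = cloze.replace("___", "").replace("  ", " ").strip().rstrip(".")
    return f"{wh} {body}?"

if __name__ == "__main__":
    context = ("The Eiffel Tower was completed in 1889 and attracts "
               "millions of visitors to Paris every year.")
    triple = make_cloze(context)
    if triple:
        cloze, answer, label = triple
        print("Cloze:   ", cloze)
        print("Question:", cloze_to_question(cloze, label))
        print("Answer:  ", answer)
```

Triples produced this way would then be used as synthetic training data for a standard extractive QA model, which is the step the paper evaluates on SQuAD.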

research 10/23/2020
Unsupervised Multi-hop Question Answering by Question Generation
Obtaining training data for Multi-hop Question Answering (QA) is extreme...

research 09/19/2023
QASnowball: An Iterative Bootstrapping Framework for High-Quality Question-Answering Data Generation
Recent years have witnessed the success of question answering (QA), espe...

research 05/06/2020
Harvesting and Refining Question-Answer Pairs for Unsupervised QA
Question Answering (QA) has shown great success thanks to the availabili...

research 01/24/2022
Unified Question Generation with Continual Lifelong Learning
Question Generation (QG), as a challenging Natural Language Processing t...

research 10/04/2020
When in Doubt, Ask: Generating Answerable and Unanswerable Questions, Unsupervised
Question Answering (QA) is key for making possible a robust communicatio...

research 08/23/2022
Unsupervised Question Answering via Answer Diversifying
Unsupervised question answering is an attractive task due to its indepen...

research 11/19/2019
Unsupervised Natural Question Answering with a Small Model
The recent (2019-02) demonstration of the power of huge language models ...
