Cheap and Good? Simple and Effective Data Augmentation for Low Resource Machine Reading

by Hoang Van et al.

We propose a simple and effective data augmentation strategy for low-resource machine reading comprehension (MRC). Our approach first pretrains the answer extraction components of an MRC system on augmented data that contains the approximate context of the correct answers, before training them on the exact answer spans. The approximate context helps the answer extraction components narrow down the location of the answers. We demonstrate that this simple strategy substantially improves both document retrieval and answer extraction performance by providing a larger context for the answers and additional training data. In particular, our method significantly improves the performance of a BERT-based retriever (by 15.12%) and answer extractor (by 4.33% F1) on TechQA, a complex, low-resource MRC task. Further, our data augmentation strategy yields significant improvements of up to 3.9% exact match (EM) and 2.7% F1 for answer extraction on PolicyQA, another practical but moderately sized QA dataset that also contains long answer spans.
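The two-stage idea above can be sketched as follows. This is a hedged illustration that assumes the approximate context is obtained by widening the exact answer span to sentence boundaries; the function `expand_to_approximate_span` and the toy example are hypothetical, not taken from the paper, whose actual augmentation procedure may differ.

```python
def expand_to_approximate_span(context: str, answer_start: int, answer_end: int):
    """Widen an exact answer span to the enclosing sentence boundaries.

    Hypothetical sketch: the widened span serves as a coarser pretraining
    target before the model is fine-tuned on the exact span.
    """
    start = context.rfind(".", 0, answer_start) + 1   # 0 if no earlier period
    end = context.find(".", answer_end)
    end = len(context) if end == -1 else end + 1      # include trailing period
    return start, end


# Toy example (illustrative only).
context = "First sentence. The answer is 42 here. Last sentence."
exact_start = context.index("42")
exact_end = exact_start + len("42")

approx_start, approx_end = expand_to_approximate_span(context, exact_start, exact_end)

# Stage 1 (pretraining) would use the approximate span as its target;
# stage 2 (fine-tuning) would use the exact answer span.
stage1_target = context[approx_start:approx_end].strip()
stage2_target = context[exact_start:exact_end]
print(stage1_target)  # The answer is 42 here.
print(stage2_target)  # 42
```

The sentence-boundary heuristic is just one plausible way to produce "approximate context"; any widening that still contains the exact answer would serve the same pretraining purpose.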






