Self-Supervised Test-Time Learning for Reading Comprehension

03/20/2021
by Pratyay Banerjee, et al.

Recent work on unsupervised question answering has shown that models can be trained with procedurally generated question-answer pairs and can achieve performance competitive with supervised methods. In this work, we consider the task of unsupervised reading comprehension and present a method that performs "test-time learning" (TTL) on a given context (text passage), without requiring training on large-scale human-authored datasets containing context-question-answer triplets. This method operates directly on a single test context, uses self-supervision to train models on synthetically generated question-answer pairs, and then infers answers to unseen human-authored questions for this context. Our method achieves accuracies competitive with fully supervised methods and significantly outperforms current unsupervised methods. TTL methods with a smaller model are also competitive with the current state-of-the-art in unsupervised reading comprehension.
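The core of TTL is generating synthetic question-answer pairs from the test passage itself before fine-tuning on them. As a rough illustration of that data-generation step, the sketch below uses a simple cloze heuristic (mask a capitalized token in each sentence); this is an assumed simplification, not the paper's actual generation procedure.

```python
import re

def generate_cloze_pairs(context):
    """Create synthetic (question, answer) pairs from a passage by
    masking one capitalized token per sentence -- a toy cloze
    heuristic standing in for the paper's generator."""
    pairs = []
    sentences = re.split(r"(?<=[.!?])\s+", context.strip())
    for sent in sentences:
        tokens = sent.split()
        # Candidate answers: capitalized words not at sentence start.
        for i, tok in enumerate(tokens[1:], start=1):
            word = tok.strip(".,!?")
            if word and word[0].isupper():
                question = " ".join(tokens[:i] + ["[MASK]"] + tokens[i + 1:])
                pairs.append((question, word))
                break  # at most one pair per sentence
    return pairs

context = ("Marie Curie was born in Warsaw. "
           "She later moved to Paris to study physics.")
for question, answer in generate_cloze_pairs(context):
    print(answer, "<-", question)
```

In the full TTL loop, pairs like these would be used to fine-tune a reading-comprehension model on the single test context before answering the human-authored questions for that context.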


Related research

05/09/2017
TriviaQA: A Large Scale Distantly Supervised Challenge Dataset for Reading Comprehension
We present TriviaQA, a challenging reading comprehension dataset contain...

07/19/2021
Bridging the Gap between Language Model and Reading Comprehension: Unsupervised MRC via Self-Supervision
Despite recent success in machine reading comprehension (MRC), learning ...

08/08/2023
Single-Sentence Reader: A Novel Approach for Addressing Answer Position Bias
Machine Reading Comprehension (MRC) models tend to take advantage of spu...

04/18/2021
Learning with Instance Bundles for Reading Comprehension
When training most modern reading comprehension models, all the question...

06/24/2019
EQuANt (Enhanced Question Answer Network)
Machine Reading Comprehension (MRC) is an important topic in the domain ...

01/20/2021
Towards Confident Machine Reading Comprehension
There has been considerable progress on academic benchmarks for the Read...

02/15/2020
Undersensitivity in Neural Reading Comprehension
Current reading comprehension models generalise well to in-distribution ...
