Few-Shot Question Answering by Pretraining Span Selection

01/02/2021
by   Ori Ram, et al.

In a number of question answering (QA) benchmarks, pretrained models have reached human parity through fine-tuning on the order of 100,000 annotated questions and answers. We explore the more realistic few-shot setting, where only a few hundred training examples are available. We show that standard span selection models perform poorly, highlighting the fact that current pretraining objectives are far removed from question answering. To address this, we propose a new pretraining scheme that is more suitable for extractive question answering. Given a passage with multiple sets of recurring spans, we mask in each set all recurring spans but one, and ask the model to select the correct span in the passage for each masked span. Masked spans are replaced with a special token, viewed as a question representation, that is later used during fine-tuning to select the answer span. The resulting model obtains surprisingly good results on multiple benchmarks, e.g., 72.7 F1 with only 128 examples on SQuAD, while maintaining competitive (and sometimes better) performance in the high-resource setting. Our findings indicate that careful design of pretraining schemes and model architecture can have a dramatic effect on performance in the few-shot setting.
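The recurring-span masking idea can be illustrated with a toy sketch. This is not the authors' implementation; the `[QUESTION]` token name, the fixed span length, and the first-occurrence-as-answer convention are simplifying assumptions made here for illustration. Given a word-tokenized passage, it keeps one occurrence of each recurring span and replaces every other occurrence with a single question token, recording which original span each masked position should select:

```python
from collections import defaultdict

QUESTION_TOKEN = "[QUESTION]"  # stand-in name for the special question token

def mask_recurring_spans(words, span_len=2):
    """Toy recurring-span masking for one passage (list of words).

    For every span of `span_len` words that occurs more than once,
    keep the first occurrence as the answer and replace each later,
    non-overlapping occurrence with a single [QUESTION] token.
    Returns the masked token list and a list of
    (question_position_in_masked_passage, answer_span_in_original) pairs.
    """
    # Collect start positions of every fixed-length span.
    occ = defaultdict(list)
    for i in range(len(words) - span_len + 1):
        occ[tuple(words[i:i + span_len])].append(i)

    masked_starts = {}  # mask start index -> (answer_start, answer_end)
    taken = set()       # original indices already used by an answer or mask
    for span, starts in occ.items():
        if len(starts) < 2:
            continue
        first, rest = starts[0], starts[1:]
        answer = range(first, first + span_len)
        if any(p in taken for p in answer):
            continue  # answer span overlaps an earlier selection; skip
        taken.update(answer)
        for s in rest:
            positions = range(s, s + span_len)
            if any(p in taken for p in positions):
                continue
            taken.update(positions)
            masked_starts[s] = (first, first + span_len)

    # Emit the masked passage, collapsing each masked span to one token.
    out, pairs = [], []
    i = 0
    while i < len(words):
        if i in masked_starts:
            pairs.append((len(out), masked_starts[i]))
            out.append(QUESTION_TOKEN)
            i += span_len
        else:
            out.append(words[i])
            i += 1
    return out, pairs
```

During fine-tuning, an actual question would be appended with the same special token, and the model's span-selection head (trained by this objective) would pick the answer span, which is what lets the approach work with only a few hundred labeled examples.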


