Choose Your QA Model Wisely: A Systematic Study of Generative and Extractive Readers for Question Answering

03/14/2022
by   Man Luo, et al.

While both extractive and generative readers have been successfully applied to the Question Answering (QA) task, little attention has been paid to systematically comparing them. Characterizing the strengths and weaknesses of the two readers is crucial not only for making a more informed reader selection in practice but also for developing a deeper understanding that fosters further research on improving readers in a principled manner. Motivated by this goal, we make the first attempt at a systematic comparison of extractive and generative readers for question answering. To be aligned with the state of the art, we explore nine transformer-based large pre-trained language models (PrLMs) as backbone architectures. Furthermore, we organize our findings under two main categories: (1) keeping the architecture invariant, and (2) varying the underlying PrLMs. Among several interesting findings, it is important to highlight that (1) generative readers perform better on long-context QA, (2) extractive readers perform better on short-context QA while also showing better out-of-domain generalization, and (3) the encoder of encoder-decoder PrLMs (e.g., T5) turns out to be a strong extractive reader and outperforms the standard choice of encoder-only PrLMs (e.g., RoBERTa). We also study the effect of multi-task learning on the two types of readers across the underlying PrLMs, and perform qualitative and quantitative diagnosis to provide further insights into future directions for modeling better readers.
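To make the distinction concrete, here is a toy sketch (not the paper's PrLM-based models) of the two reader interfaces: an extractive reader must return a span copied from the context, while a generative reader conditions on the question and context and may emit free-form text. The overlap-scoring heuristic below is purely illustrative.

```python
# Toy contrast between the two reader interfaces compared in the paper.
# These are illustrative stand-ins, not the actual transformer readers.

def extractive_reader(question: str, context: str) -> str:
    """Return the context span (here: a sentence) with the most question-word overlap."""
    q_words = set(question.lower().split())
    sentences = [s.strip() for s in context.split(".") if s.strip()]
    # Score each candidate span by lexical overlap with the question;
    # a real extractive reader instead predicts start/end token positions.
    best = max(sentences, key=lambda s: len(q_words & set(s.lower().split())))
    return best  # the answer is always a verbatim substring of the context

def generative_reader(question: str, context: str) -> str:
    """A seq2seq reader (e.g., T5) decodes token by token and is not
    restricted to copying a span; we mimic that freedom with a template."""
    return f"Answer: {extractive_reader(question, context)}"
```

The key contrast: `extractive_reader` can only ever return text present in the context, whereas a generative reader's output need not appear verbatim anywhere in the input, which is one reason the two behave differently on long versus short contexts.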

