Unifying Question Answering and Text Classification via Span Extraction

04/19/2019
by Nitish Shirish Keskar, et al.

Even as pre-trained language encoders such as BERT are shared across many tasks, the output layers of question answering and text classification models differ significantly: span decoders are frequently used for question answering, while fixed-class classification layers are used for text classification. We show that this distinction is unnecessary and that both can be unified as span extraction. A unified span-extraction approach yields superior or comparable performance in multi-task learning, low-data, and supplementary supervised pretraining experiments on several text classification and question answering benchmarks.
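To make the unification concrete, below is a minimal sketch (not the authors' released code) of the two ingredients the abstract describes: recasting a fixed-class classification example so that its label appears as a literal span in the input, and a standard two-logit span decoder head placed on top of a pre-trained encoder. The function name `to_span_example`, the class `SpanDecoder`, and the "options:" verbalization are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn


def to_span_example(text, labels, gold):
    """Recast a fixed-class classification example as span extraction.

    The candidate labels are verbalized into the input so that the gold
    label occurs as a contiguous span the decoder can point to.
    """
    context = text + " options: " + " ".join(labels)
    start = context.index(gold, len(text))  # locate the gold label among the options
    return {"context": context, "start": start, "end": start + len(gold)}


class SpanDecoder(nn.Module):
    """A standard span decoder: start/end logits over token encodings."""

    def __init__(self, hidden_size):
        super().__init__()
        self.proj = nn.Linear(hidden_size, 2)  # one logit each for start and end

    def forward(self, encodings):
        # encodings: (batch, seq_len, hidden_size) from a pre-trained
        # encoder such as BERT; the same head serves QA and classification.
        start_logits, end_logits = self.proj(encodings).unbind(dim=-1)
        return start_logits, end_logits


# Example: a sentiment example becomes a span-extraction example whose
# answer span is the word "positive" inside the verbalized options.
ex = to_span_example(
    "The movie was a delight from start to finish.",
    labels=["positive", "negative"],
    gold="positive",
)
print(ex["context"][ex["start"]:ex["end"]])  # -> positive
```

At prediction time, taking the argmax over the start and end logits selects a span; for a classification example the extracted span is read back as the class label, so a single output layer can serve both tasks.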


