BERTVision – A Parameter-Efficient Approach for Question Answering

02/24/2022
by Siduo Jiang et al.

We present a highly parameter-efficient approach for Question Answering that significantly reduces the need for extended BERT fine-tuning. Our method uses information from the hidden-state activations of each BERT transformer layer, which is discarded during typical BERT inference. Our best model achieves maximal BERT performance at a fraction of the training time and GPU or TPU expense. Performance is further improved by ensembling our model with BERT's predictions. Furthermore, we find that near-optimal performance can be achieved for QA span annotation using less training data. Our experiments show that this approach works well not only for span annotation but also for classification, suggesting that it may be extensible to a wider range of tasks.
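The core idea in the abstract is to reuse the per-layer hidden-state activations that BERT normally discards at inference as input to a small task head. The sketch below illustrates one plausible reading of that idea with random stand-in tensors: the layer count, the softmax-weighted layer mixing, and the tiny span head are assumptions for illustration, not the authors' exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# 12 transformer layers + the embedding layer, for one example.
n_layers, seq_len, hidden = 13, 8, 16

# Stand-in for BERT's per-layer hidden states; in practice these come
# from the encoder (e.g. with output_hidden_states enabled).
hidden_states = rng.standard_normal((n_layers, seq_len, hidden))

# Learned scalar mixing weights over layers (softmax-normalized),
# compressing [layers, seq, hidden] -> [seq, hidden].
layer_logits = rng.standard_normal(n_layers)
w = np.exp(layer_logits) / np.exp(layer_logits).sum()
mixed = np.tensordot(w, hidden_states, axes=(0, 0))  # [seq, hidden]

# Tiny illustrative span head: per-token start/end logits.
W_span = rng.standard_normal((hidden, 2))
logits = mixed @ W_span                               # [seq, 2]
start, end = logits[:, 0].argmax(), logits[:, 1].argmax()
print(mixed.shape, logits.shape, (start, end))
```

Because the head sees only a weighted mix of already-computed activations, training it requires no gradient updates to BERT itself, which is one way such an approach could cut fine-tuning cost.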


Related research

09/26/2021
Improving Question Answering Performance Using Knowledge Distillation and Active Learning
Contemporary question answering (QA) systems, including transformer-base...

11/07/2019
Blockwise Self-Attention for Long Document Understanding
We present BlockBERT, a lightweight and efficient BERT model that is des...

02/25/2020
Exploring BERT Parameter Efficiency on the Stanford Question Answering Dataset v2.0
In this paper we explore the parameter efficiency of BERT arXiv:1810.048...

08/19/2019
A Study of BERT for Non-Factoid Question-Answering under Passage Length Constraints
We study the use of BERT for non-factoid question-answering, focusing on...

07/24/2019
SpanBERT: Improving Pre-training by Representing and Predicting Spans
We present SpanBERT, a pre-training method that is designed to better re...

05/17/2020
Context-Based Quotation Recommendation
While composing a new document, anything from a news article to an email...

04/14/2019
Data Augmentation for BERT Fine-Tuning in Open-Domain Question Answering
Recently, a simple combination of passage retrieval using off-the-shelf ...
