Improving BERT Fine-Tuning via Self-Ensemble and Self-Distillation

02/24/2020
by Yige Xu, et al.

Fine-tuning pre-trained language models like BERT has become an effective approach in NLP and yields state-of-the-art results on many downstream tasks. Recent studies on adapting BERT to new tasks mainly focus on modifying the model structure, re-designing the pre-training tasks, and leveraging external data and knowledge. The fine-tuning strategy itself has yet to be fully explored. In this paper, we improve the fine-tuning of BERT with two effective mechanisms: self-ensemble and self-distillation. Experiments on text classification and natural language inference tasks show that our proposed methods can significantly improve the adaptation of BERT without any external data or knowledge.

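The abstract does not spell out implementation details, but the core idea can be illustrated: the teacher is a self-ensemble, i.e. an average of the student's own parameters from earlier fine-tuning steps, and the student is additionally trained to match that teacher's outputs (self-distillation), so no external teacher model or data is needed. Below is a minimal sketch of that loop, assuming a tiny stand-in classifier instead of BERT so it runs standalone; names and values such as distill_weight and ema_decay are illustrative assumptions, not the authors' settings.

```python
# Minimal sketch: self-distillation against a parameter-averaged (self-ensemble) teacher.
# A small classifier stands in for BERT so the example is self-contained.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

student = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 4))
teacher = copy.deepcopy(student)          # self-ensemble: running average of past students
for p in teacher.parameters():
    p.requires_grad_(False)

optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)
distill_weight = 1.0   # assumed weight of the distillation term
ema_decay = 0.999      # assumed averaging rate for the teacher parameters

for step in range(100):
    x = torch.randn(16, 32)               # stand-in for encoded inputs
    y = torch.randint(0, 4, (16,))        # stand-in for task labels

    student_logits = student(x)
    with torch.no_grad():
        teacher_logits = teacher(x)

    # Task loss on gold labels + self-distillation loss toward the
    # parameter-averaged teacher (MSE over logits, one plausible choice).
    loss = F.cross_entropy(student_logits, y) \
        + distill_weight * F.mse_loss(student_logits, teacher_logits)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Self-ensemble step: fold the current student into the teacher as a
    # running exponential average of parameters.
    with torch.no_grad():
        for pt, ps in zip(teacher.parameters(), student.parameters()):
            pt.mul_(ema_decay).add_(ps, alpha=1.0 - ema_decay)
```

Because the teacher is derived from the student itself rather than a separately trained model, the scheme fits the paper's claim of requiring no external data or knowledge.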

