How to Fine-Tune BERT for Text Classification?

05/14/2019
by Chi Sun, et al.

Language model pre-training has proven useful for learning universal language representations. As a state-of-the-art pre-trained language model, BERT (Bidirectional Encoder Representations from Transformers) has achieved impressive results on many language understanding tasks. In this paper, we conduct exhaustive experiments to investigate different fine-tuning methods of BERT on the text classification task and provide a general solution for BERT fine-tuning. The proposed solution obtains new state-of-the-art results on eight widely studied text classification datasets.
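
For readers who want to see what plain BERT fine-tuning for text classification looks like in practice, the sketch below uses the Hugging Face transformers library with PyTorch. It is not the authors' code and does not reproduce the paper's specific fine-tuning strategies: the checkpoint, toy data, label count, and hyperparameters (learning rate 2e-5, 3 epochs) are illustrative assumptions only.

import torch
from torch.optim import AdamW
from torch.utils.data import DataLoader, TensorDataset
from transformers import BertTokenizerFast, BertForSequenceClassification

# Toy placeholder data; a real run would use one of the benchmark text classification datasets.
texts = ["the movie was wonderful", "a dull, lifeless film"]
labels = [1, 0]

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

# Tokenize the corpus and wrap it in a DataLoader.
enc = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
dataset = TensorDataset(enc["input_ids"], enc["attention_mask"], torch.tensor(labels))
loader = DataLoader(dataset, batch_size=2, shuffle=True)

# Small learning rate, as is typical when fine-tuning BERT end to end.
optimizer = AdamW(model.parameters(), lr=2e-5)

model.train()
for epoch in range(3):
    for input_ids, attention_mask, y in loader:
        optimizer.zero_grad()
        out = model(input_ids=input_ids, attention_mask=attention_mask, labels=y)
        out.loss.backward()  # cross-entropy loss computed internally from the labels
        optimizer.step()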

Related research

02/24/2021 · From Universal Language Model to Downstream Task: Improving RoBERTa-Based Vietnamese Hate Speech Detection
Natural language processing is a fast-growing field of artificial intell...

06/15/2023 · MetricPrompt: Prompting Model as a Relevance Metric for Few-shot Text Classification
Prompting methods have shown impressive performance in a variety of text...

04/14/2022 · Label Semantic Aware Pre-training for Few-shot Text Classification
In text classification tasks, useful information is encoded in the label...

05/21/2020 · Text-to-Text Pre-Training for Data-to-Text Tasks
We study the pre-train + fine-tune strategy for data-to-text tasks. Fine...

09/13/2022 · CNN-Trans-Enc: A CNN-Enhanced Transformer-Encoder On Top Of Static BERT representations for Document Classification
BERT achieves remarkable results in text classification tasks, it is yet...

05/03/2021 · Goldilocks: Just-Right Tuning of BERT for Technology-Assisted Review
Technology-assisted review (TAR) refers to iterative active learning wor...

09/26/2019 · Pre-train, Interact, Fine-tune: A Novel Interaction Representation for Text Classification
Text representation can aid machines in understanding text. Previous wor...
