ConSERT: A Contrastive Framework for Self-Supervised Sentence Representation Transfer

05/25/2021
by Yuanmeng Yan, et al.

Learning high-quality sentence representations benefits a wide range of natural language processing tasks. Although BERT-based pre-trained language models achieve high performance on many downstream tasks, the sentence representations derived directly from them have been shown to collapse, yielding poor performance on semantic textual similarity (STS) tasks. In this paper, we present ConSERT, a Contrastive Framework for Self-Supervised Sentence Representation Transfer, which adopts contrastive learning to fine-tune BERT in an unsupervised and effective way. By making use of unlabeled texts, ConSERT solves the collapse issue of BERT-derived sentence representations and makes them more applicable to downstream tasks. Experiments on STS datasets demonstrate that ConSERT achieves an 8% relative improvement over the previous state of the art and is even comparable to the supervised SBERT-NLI. When NLI supervision is further incorporated, ConSERT achieves new state-of-the-art performance on STS tasks. Moreover, ConSERT obtains comparable results with only 1,000 training samples, demonstrating its robustness in data-scarce scenarios.
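To make the approach concrete, below is a minimal sketch of the contrastive objective in PyTorch with HuggingFace transformers: two stochastic views of each sentence are encoded and pulled together by an NT-Xent loss, with all other in-batch sentences acting as negatives. The augmentation shown (two dropout-active forward passes) and the helper names (mean_pool, nt_xent_loss) are illustrative assumptions, not the paper's exact recipe; ConSERT itself applies augmentations such as token shuffling, cutoff, and adversarial perturbation at the embedding layer.

```python
# Minimal sketch of ConSERT-style unsupervised contrastive fine-tuning.
# Helper names are hypothetical; the dropout-based augmentation is a
# stand-in for the paper's embedding-layer augmentation strategies.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

def mean_pool(hidden_states, attention_mask):
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()
    return (hidden_states * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

def nt_xent_loss(z1, z2, temperature=0.1):
    """NT-Xent: the two views of a sentence are positives;
    every other sentence in the batch is a negative."""
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=-1)  # (2B, d)
    sim = z @ z.t() / temperature                        # (2B, 2B)
    sim.fill_diagonal_(float("-inf"))                    # exclude self-pairs
    n = z1.size(0)
    # the positive for view i is view i + B, and vice versa
    targets = torch.cat([torch.arange(n) + n, torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

sentences = ["a man is playing guitar", "two dogs run through a field"]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

model.train()  # keep dropout active so the two passes yield different views
view1 = model(**inputs).last_hidden_state
view2 = model(**inputs).last_hidden_state  # second stochastic view
z1 = mean_pool(view1, inputs["attention_mask"])
z2 = mean_pool(view2, inputs["attention_mask"])

loss = nt_xent_loss(z1, z2)
loss.backward()
optimizer.step()
```

As with most NT-Xent-style objectives, larger batches supply more in-batch negatives, and the temperature controls how sharply hard negatives are penalized.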

Related research:

- Generative or Contrastive? Phrase Reconstruction for Better Sentence Representation Learning (04/20/2022)
- ALBERT: A Lite BERT for Self-supervised Learning of Language Representations (09/26/2019)
- ContraGen: Effective Contrastive Learning For Causal Language Model (10/03/2022)
- Scaling Sentence Embeddings with Large Language Models (07/31/2023)
- DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations (06/05/2020)
- Virtual Augmentation Supported Contrastive Learning of Sentence Representations (10/16/2021)
- Pairwise Supervised Contrastive Learning of Sentence Representations (09/12/2021)
