Smoothed Contrastive Learning for Unsupervised Sentence Embedding

09/09/2021
by   Xing Wu, et al.

Contrastive learning has gradually been applied to learn high-quality unsupervised sentence embeddings. Among previous unsupervised methods, the latest state-of-the-art method, as far as we know, is unsupervised SimCSE (unsup-SimCSE). Unsup-SimCSE uses the InfoNCE loss function in the training stage, pulling semantically similar sentences together and pushing apart dissimilar ones. Theoretically, we expect larger batches in unsup-SimCSE to provide more adequate comparisons among samples and to avoid overfitting. However, increasing the batch size does not always lead to improvements; it can even degrade performance once the batch size exceeds a threshold. Through statistical observation, we find that this is probably due to the introduction of low-confidence negative pairs after increasing the batch size. To alleviate this problem, we introduce a simple smoothing strategy upon the InfoNCE loss function, termed Gaussian Smoothing InfoNCE (GS-InfoNCE). Specifically, we add random Gaussian noise vectors as negative samples, which act as a smoothing of the negative sample space. Though simple, the proposed smoothing strategy brings substantial improvements to unsup-SimCSE. We evaluate GS-InfoNCE on the standard semantic textual similarity (STS) tasks. GS-InfoNCE outperforms the state-of-the-art unsup-SimCSE by an average Spearman correlation of 1.38 on BERT-base, with consistent gains on BERT-large, RoBERTa-base and RoBERTa-large.
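The sketch below illustrates, in PyTorch, the idea described in the abstract: the usual in-batch InfoNCE objective of unsup-SimCSE, with random Gaussian noise vectors appended as extra negatives. The function name gs_infonce_loss, the noise count num_noise, and the temperature value are illustrative assumptions, not the paper's exact configuration.

import torch
import torch.nn.functional as F

def gs_infonce_loss(z1, z2, num_noise=64, temperature=0.05):
    """z1, z2: (batch, dim) embeddings of the same sentences under two dropout masks."""
    batch_size, dim = z1.shape

    # Random Gaussian noise vectors serve as additional smoothing negatives
    # (assumption: standard normal noise, L2-normalized like the embeddings).
    noise = F.normalize(torch.randn(num_noise, dim, device=z1.device), dim=-1)

    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)

    # Similarities to in-batch positives/negatives and to the noise negatives.
    sim_batch = z1 @ z2.t() / temperature      # (batch, batch)
    sim_noise = z1 @ noise.t() / temperature   # (batch, num_noise)
    logits = torch.cat([sim_batch, sim_noise], dim=1)

    # The positive for sentence i is its own second view, i.e. column i.
    labels = torch.arange(batch_size, device=z1.device)
    return F.cross_entropy(logits, labels)

Note that the noise vectors only enlarge the denominator of the softmax; they never serve as positives, so they act purely as a smoothing of the negative-sample space.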


research
01/16/2022

SNCSE: Contrastive Learning for Unsupervised Sentence Embedding with Soft Negative Samples

Unsupervised sentence embedding aims to obtain the most appropriate embe...
research
09/14/2023

DebCSE: Rethinking Unsupervised Contrastive Sentence Embedding Learning in the Debiasing Perspective

Several prior studies have suggested that word frequency biases can caus...
research
09/09/2021

ESimCSE: Enhanced Sample Building Method for Contrastive Learning of Unsupervised Sentence Embedding

Contrastive learning has been attracting much attention for learning uns...
research
06/30/2020

SCE: Scalable Network Embedding from Sparsest Cut

Large-scale network embedding is to learn a latent representation for ea...
research
02/26/2022

Exploring the Impact of Negative Samples of Contrastive Learning: A Case Study of Sentence Embedding

Contrastive learning is emerging as a powerful technique for extracting ...
research
07/16/2022

Model-Aware Contrastive Learning: Towards Escaping Uniformity-Tolerance Dilemma in Training

Instance discrimination contrastive learning (CL) has achieved significa...
research
09/20/2023

CoT-BERT: Enhancing Unsupervised Sentence Representation through Chain-of-Thought

Unsupervised sentence representation learning aims to transform input se...
