SNCSE: Contrastive Learning for Unsupervised Sentence Embedding with Soft Negative Samples

01/16/2022
by Hao Wang, et al.

Unsupervised sentence embedding aims to obtain the most appropriate embedding for a sentence to reflect its semantics. Contrastive learning has been attracting increasing attention for this task. For a given sentence, current models apply diverse data augmentation methods to generate positive samples, while treating other, independent sentences as negative samples. They then adopt the InfoNCE loss to pull the embeddings of positive pairs together and push those of negative pairs apart. Although these models have made great progress on sentence embedding, we argue that they may suffer from feature suppression: they fail to distinguish and decouple textual similarity from semantic similarity, and may overestimate the semantic similarity of any pair with similar textual content, regardless of the actual semantic difference between them. This is because positive pairs in unsupervised contrastive learning come with similar, or even identical, textual content through data augmentation. To alleviate feature suppression, we propose contrastive learning for unsupervised sentence embedding with soft negative samples (SNCSE). Soft negative samples share highly similar textual content with the original samples but clearly differ from them in semantics; for example, a sentence and its negation. Specifically, we take the negation of each original sentence as its soft negative sample, and propose Bidirectional Margin Loss (BML) to introduce soft negatives into the traditional contrastive learning framework, which involves only positive and negative samples. Our experimental results show that SNCSE obtains state-of-the-art performance on the semantic textual similarity (STS) task, with an average Spearman's correlation coefficient of 78.97% on BERTbase and 79.23% on RoBERTabase. In addition, we adopt a rank-based error analysis method to detect the weaknesses of SNCSE for future study.
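
The loss design is easiest to see in code. Below is a minimal PyTorch sketch of the two ingredients the abstract names: the standard InfoNCE objective over in-batch negatives, and a Bidirectional Margin Loss that bounds how much less similar a soft negative may be to the anchor than the positive is. The function names, the hyperparameters `tau`, `alpha`, and `beta`, and the exact BML formulation are illustrative assumptions, not the paper's reference implementation.

```python
import torch
import torch.nn.functional as F


def info_nce_loss(anchors: torch.Tensor, positives: torch.Tensor,
                  tau: float = 0.05) -> torch.Tensor:
    """InfoNCE over in-batch negatives: each anchor is pulled toward its own
    positive and pushed away from every other sentence's positive."""
    # (B, B) cosine-similarity matrix between all anchors and all positives.
    sim = F.cosine_similarity(anchors.unsqueeze(1),
                              positives.unsqueeze(0), dim=-1) / tau
    # The diagonal entries are the true positive pairs.
    labels = torch.arange(anchors.size(0), device=anchors.device)
    return F.cross_entropy(sim, labels)


def bidirectional_margin_loss(anchors: torch.Tensor, positives: torch.Tensor,
                              soft_negatives: torch.Tensor,
                              alpha: float = 0.1, beta: float = 0.3) -> torch.Tensor:
    """Hedged sketch of BML: the soft negative (e.g. the negated sentence)
    should be less similar to the anchor than the positive is, but only by a
    bounded amount, i.e. delta = cos(anchor, soft) - cos(anchor, positive)
    is kept inside [-beta, -alpha]."""
    pos_sim = F.cosine_similarity(anchors, positives, dim=-1)
    soft_sim = F.cosine_similarity(anchors, soft_negatives, dim=-1)
    delta = soft_sim - pos_sim
    # Penalize delta for rising above -alpha (soft negative treated as too
    # similar) or falling below -beta (soft negative pushed too far away).
    return (F.relu(delta + alpha) + F.relu(-delta - beta)).mean()
```

In training, the two terms would simply be summed, e.g. `loss = info_nce_loss(h, h_pos) + bidirectional_margin_loss(h, h_pos, h_soft)`, so that soft negatives shape the embedding space without being scattered as far away as ordinary in-batch negatives.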


Related research

11/07/2022 · Contrastive Learning with Prompt-derived Virtual Semantic Prototypes for Unsupervised Sentence Embedding
Contrastive learning has become a new paradigm for unsupervised sentence...

09/14/2023 · DebCSE: Rethinking Unsupervised Contrastive Sentence Embedding Learning in the Debiasing Perspective
Several prior studies have suggested that word frequency biases can caus...

09/22/2022 · An Information Minimization Based Contrastive Learning Model for Unsupervised Sentence Embeddings Learning
Unsupervised sentence embeddings learning has been recently dominated by...

07/20/2023 · Identical and Fraternal Twins: Fine-Grained Semantic Contrastive Learning of Sentence Representations
The enhancement of unsupervised learning of sentence representations has...

09/09/2021 · Smoothed Contrastive Learning for Unsupervised Sentence Embedding
Contrastive learning has been gradually applied to learn high-quality un...

05/28/2023 · Whitening-based Contrastive Learning of Sentence Embeddings
This paper presents a whitening-based contrastive learning method for se...

10/08/2022 · SDA: Simple Discrete Augmentation for Contrastive Sentence Representation Learning
Contrastive learning methods achieve state-of-the-art results in unsuper...
