Identical and Fraternal Twins: Fine-Grained Semantic Contrastive Learning of Sentence Representations

07/20/2023
by Qingfa Xiao, et al.

Unsupervised learning of sentence representations has been significantly advanced by contrastive learning. This approach clusters an augmented positive instance with its anchor instance to shape the desired embedding space. However, relying solely on the contrastive objective can yield sub-optimal results because it cannot differentiate subtle semantic variations between positive pairs. In particular, common data augmentation techniques frequently introduce semantic distortion, creating a semantic margin between the members of a positive pair. The InfoNCE loss function, however, overlooks this semantic margin and prioritizes maximizing the similarity between positive pairs during training, which leaves the trained model insensitive to fine-grained semantic differences. In this paper, we introduce a novel Identical and Fraternal Twins of Contrastive Learning (IFCL) framework, capable of simultaneously adapting to the various positive pairs generated by different augmentation techniques. We propose a Twins Loss that preserves the innate margin during training and exploits the potential of data augmentation to overcome the sub-optimal issue. We also present proof-of-concept experiments combined with the contrastive objective to demonstrate the validity of the proposed Twins Loss. Furthermore, we propose a hippocampus queue mechanism to store and reuse negative instances without additional computation, which further enhances the efficiency and performance of IFCL. We verify the IFCL framework on nine semantic textual similarity tasks with both English and Chinese datasets, and the experimental results show that IFCL outperforms state-of-the-art methods.
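To make the idea concrete, below is a minimal sketch (not the authors' released code) of a standard InfoNCE objective combined with a hypothetical margin-preserving "twins" term and a simple FIFO queue of reusable negatives. Names such as `twins_margin_term`, `target_margin`, and `NegativeQueue` are illustrative assumptions about the described mechanisms, not the paper's actual API or loss formulation.

```python
# Hedged sketch: InfoNCE + a hypothetical margin-preserving term + a toy negative queue.
import torch
import torch.nn.functional as F


def info_nce(anchor, positive, negatives, temperature=0.05):
    """Standard InfoNCE: pull the positive toward the anchor, push negatives away."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)
    pos_sim = (anchor * positive).sum(-1, keepdim=True) / temperature   # (B, 1)
    neg_sim = anchor @ negatives.T / temperature                        # (B, K)
    logits = torch.cat([pos_sim, neg_sim], dim=1)
    labels = torch.zeros(anchor.size(0), dtype=torch.long)              # positive sits at index 0
    return F.cross_entropy(logits, labels)


def twins_margin_term(anchor, positive, target_margin):
    """Hypothetical margin-preserving term: rather than forcing anchor-positive
    similarity toward 1, keep it near a per-pair target that reflects the semantic
    gap introduced by the augmentation (an assumption, not the exact Twins Loss)."""
    sim = F.cosine_similarity(anchor, positive, dim=-1)
    return ((sim - target_margin) ** 2).mean()


class NegativeQueue:
    """Toy FIFO buffer of past sentence embeddings reused as negatives,
    loosely analogous to the described hippocampus queue mechanism."""
    def __init__(self, dim, size=1024):
        self.buffer = torch.randn(size, dim)
        self.ptr = 0

    def enqueue(self, embeddings):
        n = embeddings.size(0)
        idx = torch.arange(self.ptr, self.ptr + n) % self.buffer.size(0)
        self.buffer[idx] = embeddings.detach()
        self.ptr = int((self.ptr + n) % self.buffer.size(0))

    def negatives(self):
        return self.buffer


# Usage sketch: combine both terms for one training step.
B, D = 8, 768
anchor, positive = torch.randn(B, D), torch.randn(B, D)
queue = NegativeQueue(D)
loss = info_nce(anchor, positive, queue.negatives()) \
       + 0.1 * twins_margin_term(anchor, positive, target_margin=torch.full((B,), 0.9))
queue.enqueue(positive)
print(float(loss))
```

The weighting factor (0.1) and the fixed target margin (0.9) are placeholders; in the paper the margin would presumably depend on whether the pair is an "identical" or "fraternal" twin, i.e. on the augmentation used.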


