Leveraging Multi-lingual Positive Instances in Contrastive Learning to Improve Sentence Embedding

09/16/2023
by Kaiyan Zhao, et al.

Learning multi-lingual sentence embeddings is a fundamental and significant task in natural language processing. Recent approaches to learning both mono-lingual and multi-lingual sentence embeddings are mainly based on contrastive learning (CL) with an anchor, one positive, and multiple negative instances. In this work, we argue that leveraging multiple positives should be considered for multi-lingual sentence embeddings because (1) positives in a diverse set of languages can benefit cross-lingual learning, and (2) transitive similarity across multiple positives can provide reliable structural information to learn from. To investigate the impact of CL with multiple positives, we propose a novel approach, MPCL, that effectively utilizes multiple positive instances to improve the learning of multi-lingual sentence embeddings. Our experimental results on various backbone models and downstream tasks show that, compared with conventional CL, MPCL leads to better retrieval, semantic similarity, and classification performance. We also observe that, on unseen languages, sentence embedding models trained on multiple positives show better cross-lingual transfer performance than models trained on a single positive instance.
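
The abstract does not spell out MPCL's exact objective, so the following is only a hedged sketch of what a contrastive loss with multiple positive instances can look like: a standard InfoNCE-style loss averaged over several positive translations of the same anchor sentence. The function name, tensor shapes, and temperature value are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F


def multi_positive_contrastive_loss(anchor, positives, negatives, temperature=0.05):
    # Illustrative sketch only; not the paper's MPCL formulation.
    # anchor:    (d,)   embedding of the anchor sentence
    # positives: (p, d) embeddings of p positive instances (e.g. translations in several languages)
    # negatives: (n, d) embeddings of in-batch negatives
    anchor = F.normalize(anchor, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = positives @ anchor / temperature  # similarity to each positive, shape (p,)
    neg_sim = negatives @ anchor / temperature  # similarity to each negative, shape (n,)

    # Contrast each positive against all negatives and average the per-positive losses.
    losses = []
    for i in range(pos_sim.size(0)):
        logits = torch.cat([pos_sim[i:i + 1], neg_sim]).unsqueeze(0)  # the positive is class 0
        losses.append(F.cross_entropy(logits, torch.zeros(1, dtype=torch.long)))
    return torch.stack(losses).mean()


# Toy usage with random 768-dimensional embeddings, 3 positives, and 16 negatives:
loss = multi_positive_contrastive_loss(torch.randn(768), torch.randn(3, 768), torch.randn(16, 768))

Averaging over positives is just one plausible way to combine them; the paper's actual objective may weight or aggregate positives differently.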

Related research:

- 11/11/2022: English Contrastive Learning Can Learn Universal Cross-lingual Sentence Embeddings
  Universal cross-lingual sentence embeddings map semantically similar cro...
- 02/26/2022: Multi-Level Contrastive Learning for Cross-Lingual Alignment
  Cross-language pre-trained models such as multilingual BERT (mBERT) have...
- 01/28/2022: PCL: Peer-Contrastive Learning with Diverse Augmentations for Unsupervised Sentence Embeddings
  Learning sentence embeddings in an unsupervised manner is fundamental in...
- 05/28/2021: Lightweight Cross-Lingual Sentence Representation Learning
  Large-scale models for learning fixed-dimensional cross-lingual sentence...
- 04/11/2019: Scalable Cross-Lingual Transfer of Neural Sentence Embeddings
  We develop and investigate several cross-lingual alignment approaches fo...
- 03/30/2022: Auto-MLM: Improved Contrastive Learning for Self-supervised Multi-lingual Knowledge Retrieval
  Contrastive learning (CL) has become a ubiquitous approach for several n...
- 10/30/2021: TransAug: Translate as Augmentation for Sentence Embeddings
  While contrastive learning greatly advances the representation of senten...
