MCSE: Multimodal Contrastive Learning of Sentence Embeddings

04/22/2022
by Miaoran Zhang, et al.

Learning semantically meaningful sentence embeddings is an open problem in natural language processing. In this work, we propose a sentence embedding learning approach that exploits both visual and textual information via a multimodal contrastive objective. Through experiments on a variety of semantic textual similarity tasks, we demonstrate that our approach consistently improves performance across datasets and pre-trained encoders. In particular, by combining a small amount of multimodal data with a large text-only corpus, we improve the state-of-the-art average Spearman's correlation by 1.7%. By analyzing the properties of the textual embedding space, we show that our model excels in aligning semantically similar sentences, providing an explanation for its improved performance.
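
To make the objective concrete, the sketch below shows one way a combined text and sentence-image contrastive loss can be written in PyTorch: an InfoNCE term over two dropout-augmented views of each sentence (in the style of SimCSE) plus a symmetric InfoNCE term between sentence and image features projected into a shared space. This is a minimal illustration under stated assumptions, not the paper's exact implementation; the function names, the weighting factor `lam`, and the temperature value 0.05 are placeholders chosen for the example.

```python
import torch
import torch.nn.functional as F


def info_nce(anchors, positives, temperature=0.05):
    """InfoNCE loss: each anchor's positive is the matching row in `positives`;
    the other rows in the batch act as in-batch negatives."""
    anchors = F.normalize(anchors, dim=-1)
    positives = F.normalize(positives, dim=-1)
    logits = anchors @ positives.t() / temperature      # (batch, batch) cosine similarities
    labels = torch.arange(anchors.size(0), device=anchors.device)
    return F.cross_entropy(logits, labels)


def multimodal_contrastive_loss(z_text_a, z_text_b, z_text_grounded, z_image, lam=1.0):
    """Text-text contrastive term (two dropout views of the same sentence) plus a
    symmetric sentence<->image term in a shared grounded space, weighted by `lam`.
    All inputs are assumed to be encoder outputs of shape (batch, dim)."""
    text_loss = info_nce(z_text_a, z_text_b)
    mm_loss = 0.5 * (info_nce(z_text_grounded, z_image) + info_nce(z_image, z_text_grounded))
    return text_loss + lam * mm_loss


if __name__ == "__main__":
    # Toy usage: random features stand in for sentence/image encoder outputs.
    batch, dim = 8, 128
    z_a, z_b = torch.randn(batch, dim), torch.randn(batch, dim)   # two views of the sentences
    z_g, z_i = torch.randn(batch, dim), torch.randn(batch, dim)   # grounded sentence / image features
    print(multimodal_contrastive_loss(z_a, z_b, z_g, z_i).item())
```

Only sentence-image pairs contribute to the multimodal term, so a small captioned-image corpus can be combined with a much larger text-only corpus, which only feeds the text-text term.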

research 09/20/2022
Non-Linguistic Supervision for Contrastive Learning of Sentence Embeddings
Semantic representation learning for sentences is an important and well-...

research 07/15/2023
AspectCSE: Sentence Embeddings for Aspect-based Semantic Textual Similarity using Contrastive Learning and Structured Knowledge
Generic sentence embeddings provide a coarse-grained approximation of se...

research 07/14/2023
Composition-contrastive Learning for Sentence Embeddings
Vector representations of natural language are ubiquitous in search appl...

research 02/22/2021
Probing Multimodal Embeddings for Linguistic Properties: the Visual-Semantic Case
Semantic embeddings have advanced the state of the art for countless nat...

research 12/23/2018
Improving Context-Aware Semantic Relationships in Sparse Mobile Datasets
Traditional semantic similarity models often fail to encapsulate the ext...

research 08/29/2022
Reweighting Strategy based on Synthetic Data Identification for Sentence Similarity
Semantically meaningful sentence embeddings are important for numerous t...

research 07/06/2023
LEA: Improving Sentence Similarity Robustness to Typos Using Lexical Attention Bias
Textual noise, such as typos or abbreviations, is a well-known issue tha...
