D2CSE: Difference-aware Deep continuous prompts for Contrastive Sentence Embeddings

04/18/2023
by   Hyunjae Lee, et al.

This paper describes Difference-aware Deep continuous prompts for Contrastive Sentence Embeddings (D2CSE), a method for learning sentence embeddings. Compared with state-of-the-art approaches, D2CSE computes sentence vectors that excel at distinguishing subtle differences between similar sentences, using a simple neural architecture for continuous prompts. Unlike existing architectures that require multiple pretrained language models (PLMs) to process a pair of original and corrupted (subtly modified) sentences, D2CSE avoids the cumbersome fine-tuning of multiple PLMs by optimizing only the continuous prompts through multiple tasks, i.e., contrastive learning and conditional replaced token detection, all done in a self-guided manner. D2CSE overloads a single PLM with continuous prompts and, as a result, greatly reduces memory consumption. The number of trainable parameters in D2CSE is about 1% of that in existing approaches, while the quality of sentence embeddings improves substantially. We evaluate D2CSE on seven Semantic Textual Similarity (STS) benchmarks using three metrics: Spearman's rank correlation, recall@K on a retrieval task, and the anisotropy of the embedding space measured by alignment and uniformity. Our empirical results suggest that shallow (not too meticulously devised) continuous prompts can be honed effectively for multiple NLP tasks and lead to improvements upon existing state-of-the-art approaches.
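The contrastive learning objective mentioned above is not spelled out in the abstract; the sketch below shows the standard in-batch InfoNCE loss commonly used in contrastive sentence-embedding work (e.g., SimCSE-style training), which is an assumption about the family of objective rather than D2CSE's exact formulation. Each anchor embedding is pulled toward its matching positive while all other positives in the batch act as negatives:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def info_nce_loss(anchors, positives, temperature=0.05):
    """In-batch contrastive (InfoNCE) loss over toy embeddings.

    anchors[i] and positives[i] form a positive pair; every
    positives[j] with j != i serves as an in-batch negative.
    This is an illustrative sketch, not D2CSE's exact objective.
    """
    losses = []
    for i, a in enumerate(anchors):
        sims = [cosine(a, p) / temperature for p in positives]
        # log-sum-exp with max subtraction for numerical stability
        m = max(sims)
        log_denom = m + math.log(sum(math.exp(s - m) for s in sims))
        losses.append(log_denom - sims[i])  # -log softmax at the true index
    return sum(losses) / len(losses)
```

With matching pairs aligned, the loss is small; shuffling the positives (so each anchor's "correct" index points at a dissimilar vector) increases it, which is what drives the embeddings apart for subtly different sentences.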


research
06/09/2021

Sentence Embeddings using Supervised Contrastive Learning

Sentence embeddings encode sentences in fixed dense vectors and have pla...
research
05/24/2023

Bridging Continuous and Discrete Spaces: Interpretable Sentence Representation Learning via Compositional Operations

Traditional sentence embedding models encode sentences into vector repre...
research
07/14/2023

Composition-contrastive Learning for Sentence Embeddings

Vector representations of natural language are ubiquitous in search appl...
research
03/14/2022

Deep Continuous Prompt for Contrastive Learning of Sentence Embeddings

The performance of sentence representation has been remarkably improved ...
research
07/31/2023

Scaling Sentence Embeddings with Large Language Models

Large language models (LLMs) have recently garnered significant interest...
research
02/26/2022

Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning

Recently, finetuning a pretrained language model to capture the similari...
research
05/10/2023

LACoS-BLOOM: Low-rank Adaptation with Contrastive objective on 8 bits Siamese-BLOOM

Text embeddings are useful features for several NLP applications, such a...
