LACoS-BLOOM: Low-rank Adaptation with Contrastive objective on 8 bits Siamese-BLOOM

05/10/2023
by Wen-Yu Hua, et al.

Text embeddings are useful features for several NLP applications, such as sentence similarity, text clustering, and semantic search. In this paper, we present a Low-rank Adaptation with a Contrastive objective on top of 8-bit Siamese-BLOOM, a multilingual large language model optimized to produce semantically meaningful word embeddings. The innovation is threefold. First, we cast the BLOOM weights to 8-bit values. Second, we fine-tune BLOOM with a scalable adapter (LoRA) and an 8-bit Adam optimizer for sentence similarity classification. Third, we apply a Siamese architecture to the BLOOM model with a contrastive objective to mitigate the scarcity of multilingual labeled data. The experimental results show that the quality of the embeddings learned by LACoS-BLOOM is proportional to the number of model parameters and the amount of unlabeled training data. With the parameter-efficient fine-tuning design, we are able to run the 7.1-billion-parameter BLOOM end-to-end on a single GPU machine with 32GB memory. Compared to the previous solution, Sentence-BERT, we achieve significant improvements on both English and multilingual STS tasks.
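The three ingredients named in the abstract (8-bit weights, LoRA adapters with an 8-bit Adam optimizer, and a Siamese contrastive objective) can be pieced together roughly as follows. This is a minimal sketch assuming the Hugging Face transformers, peft, and bitsandbytes libraries; the hyperparameters, the mean-pooling choice, and the in-batch-negatives loss are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of the LACoS-BLOOM recipe described in the abstract:
# 8-bit BLOOM + LoRA adapters + 8-bit Adam + a Siamese contrastive objective.
# Library choices and all hyperparameters below are assumptions for illustration.
import torch
import torch.nn.functional as F
import bitsandbytes as bnb
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

model_name = "bigscience/bloom-7b1"
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Step 1: cast the BLOOM weights to 8-bit values at load time.
model = AutoModel.from_pretrained(model_name, load_in_8bit=True, device_map="auto")

# Step 2: attach low-rank adapters (LoRA) so only a small set of weights is trained.
lora_cfg = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                      target_modules=["query_key_value"], bias="none")
model = get_peft_model(model, lora_cfg)

# 8-bit Adam keeps optimizer state small enough to fit a single 32GB GPU.
optimizer = bnb.optim.Adam8bit(model.parameters(), lr=2e-4)

def embed(sentences):
    """Mean-pool the last hidden states into fixed-size sentence embeddings."""
    batch = tokenizer(sentences, padding=True, truncation=True,
                      return_tensors="pt").to(model.device)
    hidden = model(**batch).last_hidden_state              # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)    # (B, H)

def contrastive_step(anchors, positives, temperature=0.05):
    """Step 3: Siamese forward pass with an in-batch-negatives contrastive loss."""
    a, p = embed(anchors), embed(positives)
    # Cosine similarity between every anchor and every positive in the batch.
    sims = F.cosine_similarity(a.unsqueeze(1), p.unsqueeze(0), dim=-1) / temperature
    labels = torch.arange(sims.size(0), device=sims.device)  # i-th anchor matches i-th positive
    loss = F.cross_entropy(sims, labels)
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()
```

Because both branches of the Siamese pass share the same 8-bit backbone and LoRA weights, only the adapter parameters and their 8-bit optimizer state need to live in GPU memory during fine-tuning, which is what makes the end-to-end run on a single 32GB GPU plausible.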

