JCSE: Contrastive Learning of Japanese Sentence Embeddings and Its Applications

01/19/2023
by   Zihao Chen, et al.

Contrastive learning is widely used for sentence representation learning. Despite this prevalence, most studies have focused exclusively on English, and few address domain adaptation for domain-specific downstream tasks, especially for low-resource languages like Japanese, which suffer from insufficient target-domain data and the lack of a proper training strategy. To overcome these challenges, we propose a novel Japanese sentence representation framework, JCSE (derived from "Contrastive learning of Sentence Embeddings for Japanese"), that creates training data by generating sentences and synthesizing them with sentences available in a target domain. Specifically, a pre-trained data generator is fine-tuned to a target domain using our collected corpus. It is then used to generate contradictory sentence pairs that are used in contrastive learning for adapting a Japanese language model to a specific task in the target domain. Another problem in Japanese sentence representation learning is the difficulty of evaluating existing embedding methods due to the lack of benchmark datasets. Thus, we establish a comprehensive Japanese Semantic Textual Similarity (STS) benchmark on which various embedding models are evaluated. Based on this benchmark result, multiple embedding methods are chosen and compared with JCSE on two domain-specific tasks: STS in a clinical domain and information retrieval in an educational domain. The results show that JCSE achieves significant performance improvement, surpassing direct transfer and other training strategies. This empirically demonstrates JCSE's effectiveness and practicability for downstream tasks of a low-resource language.
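The contrastive objective the abstract describes — pulling each sentence toward its positive pair while pushing it away from generated contradictory sentences used as hard negatives — is commonly implemented as a SimCSE-style InfoNCE loss. The sketch below is illustrative, not the paper's actual implementation: the function name, the use of NumPy, and the temperature value are assumptions for demonstration.

```python
import numpy as np

def info_nce_loss(anchors, positives, hard_negatives, temperature=0.05):
    """Sketch of a SimCSE-style contrastive loss with hard negatives.

    anchors, positives, hard_negatives: (batch, dim) sentence embeddings.
    For anchor i, the correct match is positives[i]; all other in-batch
    positives plus all hard negatives (e.g. generated contradictory
    sentences) act as negatives.
    """
    def normalize(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    a = normalize(anchors)
    p = normalize(positives)
    n = normalize(hard_negatives)

    # Cosine similarity of each anchor to every positive and hard negative.
    sim = np.concatenate([a @ p.T, a @ n.T], axis=1) / temperature

    # Cross-entropy where the "label" for anchor i is positive i.
    logits = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(a))
    return -log_probs[idx, idx].mean()
```

In practice the embeddings would come from a Japanese pre-trained language model (e.g. a BERT-style encoder), and the hard negatives from the fine-tuned data generator described above; minimizing this loss adapts the encoder to the target domain.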

