Imposing Relation Structure in Language-Model Embeddings Using Contrastive Learning

09/02/2021
by Christos Theodoropoulos et al.

Though language-model text embeddings have revolutionized NLP research, their ability to capture high-level semantic information, such as relations between entities in text, is limited. In this paper, we propose a novel contrastive learning framework that trains sentence embeddings to encode the relations in a graph structure. Given a sentence (unstructured text) and its graph, we use contrastive learning to impose relation-related structure on the token-level representations of the sentence obtained with a CharacterBERT (El Boukkouri et al., 2020) model. The resulting relation-aware sentence embeddings achieve state-of-the-art results on the relation extraction task using only a simple k-NN classifier, demonstrating the success of the proposed method. Additional visualization via a t-SNE analysis shows the effectiveness of the learned representation space compared to baselines. Furthermore, we show that we can learn a different space for named entity recognition, again using a contrastive learning objective, and demonstrate how to successfully combine both representation spaces in an entity-relation task.
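The core idea, training embeddings so that sentences expressing the same relation cluster together, can be illustrated with a supervised contrastive (InfoNCE-style) objective. The sketch below is not the authors' code; the function name, toy embeddings, and labels are illustrative assumptions, and the NumPy loss stands in for the model-side training objective.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """InfoNCE-style loss: positives are embeddings sharing a relation label."""
    # L2-normalize so dot products are cosine similarities.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    mask = ~np.eye(n, dtype=bool)            # exclude self-similarity
    losses = []
    for i in range(n):
        pos = mask[i] & (labels == labels[i])  # same-relation positives
        if not pos.any():
            continue
        # log of the softmax denominator over all other examples
        log_denom = np.log(np.exp(sim[i][mask[i]]).sum())
        losses.append(-(sim[i][pos] - log_denom).mean())
    return float(np.mean(losses))

# Toy check: embeddings clustered by relation label give a lower loss
# than the same embeddings with mismatched labels.
labels = np.array([0, 0, 1, 1])
clustered = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
shuffled = clustered[[0, 2, 1, 3]]           # labels no longer match clusters
assert supervised_contrastive_loss(clustered, labels) < \
       supervised_contrastive_loss(shuffled, labels)
```

Minimizing this loss pulls same-relation sentence representations together and pushes different relations apart, which is what makes a simple k-NN classifier sufficient at inference time.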

Related research

Sentence-Level Relation Extraction via Contrastive Learning with Descriptive Relation Prompts (04/11/2023)
Sentence-level relation extraction aims to identify the relation between…

StrAE: Autoencoding for Pre-Trained Embeddings using Explicit Structure (05/09/2023)
This work explores the utility of explicit structure for representation…

InfoCSE: Information-aggregated Contrastive Learning of Sentence Embeddings (10/08/2022)
Contrastive learning has been extensively studied in sentence embedding…

Generating Coherent Narratives by Learning Dynamic and Discrete Entity States with a Contrastive Framework (08/08/2022)
Despite advances in generating fluent texts, existing pretraining models…

Toward Interpretable Semantic Textual Similarity via Optimal Transport-based Contrastive Sentence Learning (02/26/2022)
Recently, finetuning a pretrained language model to capture the similari…

TaDSE: Template-aware Dialogue Sentence Embeddings (05/23/2023)
Learning high quality sentence embeddings from dialogues has drawn incre…

Multi-Scale Contrastive Co-Training for Event Temporal Relation Extraction (09/01/2022)
Extracting temporal relationships between pairs of events in texts is a…
