Sentence Representations via Gaussian Embedding

05/22/2023
by Shohei Yoda, et al.

Recent progress in sentence embedding, which represents the meaning of a sentence as a point in a vector space, has achieved high performance on tasks such as semantic textual similarity (STS). However, representing a sentence as a single point can capture only part of the diverse information sentences carry; in particular, it cannot express asymmetric relationships between sentences. This paper proposes GaussCSE, a Gaussian-distribution-based contrastive learning framework for sentence embedding that can handle asymmetric relationships between sentences, along with a similarity measure for identifying inclusion relations. Our experiments show that GaussCSE matches the performance of previous methods on natural language inference tasks and can estimate the direction of entailment relations, which is difficult with point representations.
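The abstract does not specify the similarity measure, but a common choice for Gaussian embeddings is the (asymmetric) KL divergence between the two distributions, whose direction-dependence is what allows entailment direction to be read off. The sketch below is illustrative only: the means, variances, and the convention that a more general sentence gets a broader distribution are hypothetical assumptions, not GaussCSE's actual formulation.

```python
import numpy as np

def kl_divergence_diag(mu0, var0, mu1, var1):
    """Closed-form KL(N0 || N1) for diagonal-covariance Gaussians."""
    return 0.5 * np.sum(
        var0 / var1 + (mu1 - mu0) ** 2 / var1 - 1.0 + np.log(var1 / var0)
    )

# Toy embeddings (assumed values): a more general sentence is given a
# broader distribution than a specific sentence it entails.
mu_gen, var_gen = np.zeros(4), np.full(4, 2.0)        # "An animal is moving."
mu_spec, var_spec = np.full(4, 0.1), np.full(4, 0.5)  # "A dog is running."

# The asymmetry KL(spec || gen) < KL(gen || spec) signals that the
# specific sentence's distribution is "contained" in the general one,
# i.e. the entailment runs from specific to general.
forward = kl_divergence_diag(mu_spec, var_spec, mu_gen, var_gen)
backward = kl_divergence_diag(mu_gen, var_gen, mu_spec, var_spec)
print(forward < backward)  # True
```

A symmetric measure such as cosine similarity between mean vectors would erase exactly this directional signal, which is why a point representation struggles with entailment direction.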


