
In Search for Linear Relations in Sentence Embedding Spaces

10/08/2019
by Petra Barančíková, et al.
Charles University in Prague

We present an introductory investigation into continuous-space vector representations of sentences. Using a simple pattern-based method, we acquire pairs of very similar sentences from natural language inference datasets, where each pair differs only by a small alteration (such as the change of a noun, or the addition of an adjective, a noun, or punctuation). We examine how such a small change in the sentence text affects its representation in the continuous space, and how these alterations are reflected by several popular sentence embedding models. We find that the vector differences of some embeddings do reflect small changes within a sentence.
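The core measurement described above — taking the vector difference of a minimally altered sentence pair and checking whether the same alteration produces the same direction across pairs — can be sketched as follows. The `embed` function here is a hypothetical additive toy encoder standing in for the real sentence embedding models the paper evaluates; the sentence pairs are illustrative, not from the paper's data.

```python
import zlib
import numpy as np

def embed(sentence, dim=64):
    """Toy deterministic embedding: sum of per-word pseudo-random vectors.
    A stand-in for a real sentence encoder, used only to illustrate the test."""
    vec = np.zeros(dim)
    for word in sentence.lower().split():
        # Seed each word's vector from a stable hash so runs are reproducible.
        rng = np.random.default_rng(zlib.crc32(word.encode()))
        vec += rng.standard_normal(dim)
    return vec

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# Two pairs differing by the same alteration: adding the adjective "red".
pairs = [
    ("a man is holding a ball", "a man is holding a red ball"),
    ("a woman bought a car", "a woman bought a red car"),
]

# Difference vector for the alteration in each pair.
diffs = [embed(b) - embed(a) for a, b in pairs]

# If the alteration corresponds to a linear direction in the space,
# the difference vectors should point the same way.
print(cosine(diffs[0], diffs[1]))  # → 1.0 for this purely additive toy encoder
```

For a purely additive encoder the difference is exactly the vector of the added word, so the cosine is 1.0; for real sentence embedding models it is generally lower, and measuring how far it falls short is precisely the kind of question the paper investigates.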

