
In Search for Linear Relations in Sentence Embedding Spaces

by   Petra Barančíková, et al.
Charles University in Prague

We present an introductory investigation into continuous-space vector representations of sentences. Using a simple pattern method, we extract pairs of very similar sentences that differ only by a small alteration (such as a change of a noun, or the addition of an adjective, noun, or punctuation mark) from datasets for natural language inference. We examine how such a small change in the sentence text affects its representation in the continuous space, and how these alterations are reflected by several popular sentence embedding models. We find that the vector differences of some embeddings do reflect small changes within a sentence.
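The core idea, checking whether a fixed alteration corresponds to a consistent direction in embedding space, can be sketched as follows. This is a minimal illustration with hypothetical 4-dimensional toy embeddings (the paper evaluates real sentence embedding models); the sentences and vectors are invented for demonstration only.

```python
import numpy as np

# Toy stand-in embeddings; real models produce high-dimensional vectors.
emb = {
    "A man is running.":        np.array([0.9, 0.1, 0.3, 0.0]),
    "A tall man is running.":   np.array([0.9, 0.1, 0.3, 0.4]),
    "A woman is singing.":      np.array([0.2, 0.8, 0.5, 0.0]),
    "A tall woman is singing.": np.array([0.2, 0.8, 0.5, 0.4]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Difference vectors for the same alteration (adding the adjective "tall")
# applied to two different base sentences:
d1 = emb["A tall man is running."] - emb["A man is running."]
d2 = emb["A tall woman is singing."] - emb["A woman is singing."]

# If the space encodes the alteration linearly, the two difference
# vectors should point in (nearly) the same direction.
print(round(cosine(d1, d2), 2))
```

A high cosine similarity between difference vectors of sentence pairs sharing the same alteration type is evidence that the embedding space reflects that alteration as an (approximately) linear relation.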

