Unsupervised Sentence-embeddings by Manifold Approximation and Projection

02/07/2021
by Subhradeep Kayal, et al.

The concept of unsupervised universal sentence encoders has gained traction recently, wherein pre-trained models generate effective task-agnostic fixed-dimensional representations for phrases, sentences and paragraphs. Such methods are of varying complexity, from simple weighted averages of word vectors to complex language models based on bidirectional transformers. In this work we propose a novel technique to generate sentence embeddings in an unsupervised fashion by projecting the sentences onto a fixed-dimensional manifold with the objective of preserving local neighbourhoods in the original space. To delineate such neighbourhoods we experiment with several set-distance metrics, including the recently proposed Word Mover's distance, while the fixed-dimensional projection is achieved by employing a scalable and efficient manifold approximation method rooted in topological data analysis. We test our approach, which we term EMAP, or Embeddings by Manifold Approximation and Projection, on six publicly available text-classification datasets of varying size and complexity. Empirical results show that our method consistently performs similarly to or better than several alternative state-of-the-art approaches.
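The abstract outlines a two-step pipeline: compute pairwise set-distances between sentences (for example, Word Mover's distance over pre-trained word vectors), then feed the resulting distance matrix to a neighbourhood-preserving manifold-approximation method to obtain fixed-dimensional embeddings. The toy sketch below illustrates that idea only; the choice of UMAP as the projection step, and of gensim's GloVe vectors and wmdistance for the set-distance, are assumptions made here rather than details confirmed by the abstract.

    # Minimal sketch of the two-step idea described above (toy example, assumed tooling).
    # Assumed, not taken from the paper: GloVe vectors via gensim, gensim's wmdistance
    # for the set-distance, and UMAP for the neighbourhood-preserving projection.
    # Requires: gensim (plus POT or pyemd depending on gensim version) and umap-learn.
    import numpy as np
    import umap                      # pip install umap-learn
    import gensim.downloader as api  # pip install gensim

    sentences = [
        "the cat sat on the mat",
        "a dog lay on the rug",
        "the kitten slept on the sofa",
        "stock prices fell sharply today",
        "markets dropped after the earnings report",
        "the central bank raised interest rates",
    ]
    tokenised = [s.split() for s in sentences]

    # Pre-trained word vectors used inside the Word Mover's distance.
    word_vectors = api.load("glove-wiki-gigaword-50")

    # Pairwise Word Mover's distance matrix between all sentences.
    n = len(tokenised)
    distances = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d = word_vectors.wmdistance(tokenised[i], tokenised[j])
            distances[i, j] = distances[j, i] = d

    # Project onto a fixed-dimensional manifold while preserving the local
    # neighbourhoods encoded in the precomputed distance matrix.
    projector = umap.UMAP(n_components=2, metric="precomputed",
                          n_neighbors=2, random_state=42)
    sentence_embeddings = projector.fit_transform(distances)
    print(sentence_embeddings.shape)  # (6, 2)

In practice the target dimensionality would be larger than two and the projection would be fit over a full corpus; the small numbers above simply keep the example runnable.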

Related research

05/18/2023
Ditto: A Simple and Efficient Approach to Improve Sentence Embeddings
Prior studies diagnose the anisotropy problem in sentence representation...

10/30/2018
Word Mover's Embedding: From Word2Vec to Document Embedding
While the celebrated Word2Vec technique yields semantically rich represe...

03/07/2017
Unsupervised Learning of Sentence Embeddings using Compositional n-Gram Features
The recent tremendous success of unsupervised word embeddings in a multi...

05/05/2017
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data
Many modern NLP systems rely on word embeddings, previously trained in a...

04/25/2023
Compressing Sentence Representation with Maximum Coding Rate Reduction
In most natural language inference problems, sentence representation is ...

06/02/2021
Discrete Cosine Transform as Universal Sentence Encoder
Modern sentence encoders are used to generate dense vector representatio...

05/14/2021
Out-of-Manifold Regularization in Contextual Embedding Space for Text Classification
Recent studies on neural networks with pre-trained weights (i.e., BERT) ...
