Sentence Meta-Embeddings for Unsupervised Semantic Textual Similarity

11/09/2019
by Nina Poerner, et al.

We address the task of unsupervised Semantic Textual Similarity (STS) by ensembling diverse pre-trained sentence encoders into sentence meta-embeddings. We apply and extend different meta-embedding methods from the word embedding literature, including dimensionality reduction (Yin and Schütze, 2016), generalized Canonical Correlation Analysis (Rastogi et al., 2015) and cross-view autoencoders (Bollegala and Bao, 2018). We set a new unsupervised state of the art (SoTA) on the STS Benchmark and on the STS12–STS16 datasets, with gains of between 3.7…
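Of the three meta-embedding methods the abstract names, the simplest to illustrate is dimensionality reduction over concatenated source embeddings, in the spirit of Yin and Schütze (2016). The sketch below is a minimal, hypothetical illustration, not the authors' released code: the two random matrices stand in for the outputs of two pre-trained sentence encoders, and the shapes and target dimension k are illustrative assumptions.

```python
# Minimal sketch of concatenation + SVD meta-embeddings (assumed setup,
# not the paper's implementation). Two random matrices stand in for the
# sentence embeddings produced by two pre-trained encoders.
import numpy as np

rng = np.random.default_rng(0)
n_sentences = 1000
emb_a = rng.normal(size=(n_sentences, 768))  # stand-in for encoder A's output
emb_b = rng.normal(size=(n_sentences, 512))  # stand-in for encoder B's output

def l2_normalize(x):
    """Row-wise L2 normalization so each source contributes comparably."""
    return x / np.linalg.norm(x, axis=1, keepdims=True)

# Naive meta-embedding: concatenate the per-source embeddings.
concat = np.hstack([l2_normalize(emb_a), l2_normalize(emb_b)])

# Center, then truncate with SVD to get a compact meta-embedding.
k = 300  # assumed target dimensionality
centered = concat - concat.mean(axis=0, keepdims=True)
u, s, _ = np.linalg.svd(centered, full_matrices=False)
meta = u[:, :k] * s[:k]  # rank-k representation of the concatenation

# Unsupervised STS then scores a sentence pair by cosine similarity.
def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(meta.shape, cosine(meta[0], meta[1]))
```

The other two methods the abstract lists replace the SVD step with a learned projection: generalized CCA finds per-source projections whose outputs maximally correlate, and a cross-view autoencoder trains each source's embedding to reconstruct the others.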


research · 04/16/2022
Unsupervised Attention-based Sentence-Level Meta-Embeddings from Contextualised Language Models
A variety of contextualised language models have been proposed in the NL...

research · 07/20/2023
Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models
Jina Embeddings constitutes a set of high-performance sentence embedding...

research · 04/20/2018
Sequential Network Transfer: Adapting Sentence Embeddings to Human Activities and Beyond
We study the problem of adapting neural sentence embedding models to the...

research · 03/29/2021
Extending Multi-Sense Word Embedding to Phrases and Sentences for Unsupervised Semantic Applications
Most unsupervised NLP models represent each word with a single point or ...

research · 11/16/2018
Investigating the Effects of Word Substitution Errors on Sentence Embeddings
A key initial step in several natural language processing (NLP) tasks in...

research · 11/15/2017
Pushing the Limits of Paraphrastic Sentence Embeddings with Millions of Machine Translations
We extend the work of Wieting et al. (2017), back-translating a large pa...

research · 07/14/2022
A Meta-learning Formulation of the Autoencoder Problem
A rapidly growing area of research is the use of machine learning approa...
