Towards Universal Paraphrastic Sentence Embeddings

11/25/2015
by John Wieting et al.

We consider the problem of learning general-purpose, paraphrastic sentence embeddings based on supervision from the Paraphrase Database (Ganitkevitch et al., 2013). We compare six compositional architectures, evaluating them on annotated textual similarity datasets drawn both from the same distribution as the training data and from a wide range of other domains. We find that the most complex architectures, such as long short-term memory (LSTM) recurrent neural networks, perform best on the in-domain data. However, in out-of-domain scenarios, simple architectures such as word averaging vastly outperform LSTMs. Our simplest averaging model is even competitive with systems tuned for the particular tasks while also being extremely efficient and easy to use. In order to better understand how these architectures compare, we conduct further experiments on three supervised NLP tasks: sentence similarity, entailment, and sentiment classification. We again find that the word averaging models perform well for sentence similarity and entailment, outperforming LSTMs. However, on sentiment classification, we find that the LSTM performs very strongly, even recording new state-of-the-art performance on the Stanford Sentiment Treebank. We then demonstrate how to combine our pretrained sentence embeddings with these supervised tasks, using them both as a prior and as a black box feature extractor. This leads to performance rivaling the state of the art on the SICK similarity and entailment tasks. We release all of our resources to the research community with the hope that they can serve as the new baseline for further work on universal sentence embeddings.
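As a concrete illustration of the simplest architecture described above, the sketch below shows a word-averaging sentence encoder trained with a margin-based paraphrase loss of the kind used with PPDB-style supervision. This is a sketch only, not the authors' released code: the class and function names, the PyTorch framing, the margin value, and the use of a single sampled negative per pair are illustrative assumptions.

```python
# Minimal sketch (not the authors' released code) of a word-averaging sentence
# encoder with a hinge-style paraphrase objective. Names, margin, and the
# negative-sampling scheme are illustrative placeholders.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WordAveragingEncoder(nn.Module):
    def __init__(self, vocab_size, dim=300, padding_idx=0):
        super().__init__()
        # One vector per word type; the sentence embedding is their mean.
        self.embed = nn.Embedding(vocab_size, dim, padding_idx=padding_idx)

    def forward(self, token_ids, mask):
        # token_ids: (batch, seq_len) word indices; mask: (batch, seq_len) with 1 for real tokens.
        vecs = self.embed(token_ids) * mask.unsqueeze(-1).float()
        lengths = mask.sum(dim=1, keepdim=True).clamp(min=1).float()
        return vecs.sum(dim=1) / lengths  # (batch, dim) averaged sentence embeddings


def margin_loss(encoder, s1, m1, s2, m2, neg, m_neg, delta=0.4):
    """Hinge loss pushing a paraphrase pair (s1, s2) to be more similar
    than s1 and a sampled negative `neg` by at least `delta` in cosine space."""
    e1 = encoder(s1, m1)
    e2 = encoder(s2, m2)
    en = encoder(neg, m_neg)
    pos = F.cosine_similarity(e1, e2)
    neg_sim = F.cosine_similarity(e1, en)
    return F.relu(delta - pos + neg_sim).mean()
```

In this setup the sentence embedding is simply the mean of its word vectors, which is what makes the averaging model so cheap to run and so easy to reuse as a black-box feature extractor out of domain.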


