Learning sentence embeddings using Recursive Networks

by Anson Bastos et al.

Learning sentence vectors that generalise well is a challenging task. In this paper we compare three methods of learning phrase embeddings: 1) using LSTMs, 2) using recursive nets, and 3) a variant of method 2 that uses the POS information of the phrase. We train our models on dictionary definitions of words to obtain a reverse-dictionary application similar to Felix et al. [1]. To see whether our embeddings transfer to a new task, we also train and test on the Rotten Tomatoes dataset [2]. We train both with the sentence embeddings kept fixed and with fine-tuning.
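The recursive-net approach (method 2) composes word vectors bottom-up along a binary parse tree, applying one shared composition function at every internal node until a single sentence vector remains. A minimal numpy sketch of that idea, with the toy vocabulary, dimensions, and tanh composition all chosen here for illustration (the paper's exact parameterisation may differ):

```python
import numpy as np

np.random.seed(0)

# Hypothetical setup: d-dimensional word vectors for a toy vocabulary.
d = 8
vocab = {w: np.random.randn(d) * 0.1 for w in ["the", "cat", "sat", "down"]}

# One shared composition matrix W and bias b, reused at every tree node,
# as in a plain recursive network.
W = np.random.randn(d, 2 * d) * 0.1
b = np.zeros(d)

def embed(tree):
    """Recursively compose a binary parse tree into a single vector.

    A tree is either a word (str) or a pair (left_subtree, right_subtree).
    """
    if isinstance(tree, str):
        return vocab[tree]                      # leaf: look up word vector
    left, right = tree
    children = np.concatenate([embed(left), embed(right)])
    return np.tanh(W @ children + b)            # compose the two children

# ((the cat) (sat down)) -> one d-dimensional sentence embedding
sentence = (("the", "cat"), ("sat", "down"))
vec = embed(sentence)
```

The POS-aware variant (method 3) could then, for example, select a different composition matrix per POS tag of the parent node instead of the single shared `W` used above.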

