Exploiting Invertible Decoders for Unsupervised Sentence Representation Learning

09/08/2018
by Shuai Tang, et al.

Encoder-decoder models for unsupervised sentence representation learning typically discard the decoder after training on a large unlabelled corpus, since only the encoder is needed to map an input sentence to a vector representation. However, the parameters learnt in the decoder also carry useful information about language. To make use of the decoder after learning, we present two types of decoding functions whose inverses can be derived easily, without an expensive inverse calculation. The inverse of the decoding function therefore serves as another encoder that produces sentence representations. We show that, with careful design of the decoding functions, the model learns good sentence representations, and that the ensemble of the representations produced by the encoder and the inverse of the decoder demonstrates even better generalisation ability and solid transferability.
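To illustrate the core idea, here is a minimal sketch of a decoder whose inverse is cheap to obtain. It assumes a linear decoder with an orthogonal weight matrix, so the inverse is simply the transpose; the names, dimensions, and the orthogonality constraint are illustrative assumptions, not the paper's exact decoding functions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical invertible decoder: an orthogonal weight matrix built
# via QR decomposition, so that W^{-1} = W^T and no expensive matrix
# inversion is needed at "inverse encoding" time.
W, _ = np.linalg.qr(rng.standard_normal((dim, dim)))

def decode(z):
    """Decoder: maps a sentence representation z into the output space."""
    return W @ z

def inverse_encode(x):
    """Inverse of the decoder, reused as a second encoder (W^T = W^{-1})."""
    return W.T @ x

z = rng.standard_normal(dim)          # a sentence representation
x = decode(z)                         # decode it
z_recovered = inverse_encode(x)       # recover it via the cheap inverse
assert np.allclose(z, z_recovered)
```

In this sketch, `inverse_encode` plays the role of the "second encoder" from the abstract: representations from the trained encoder and from the decoder's inverse could then be concatenated or averaged into an ensemble.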


research (10/28/2017)
Exploring Asymmetric Encoder-Decoder Structure for Context-based Sentence Representation Learning
Context information plays an important role in human language understand...

research (04/22/2018)
Unsupervised Discrete Sentence Representation Learning for Interpretable Neural Dialog Generation
The encoder-decoder dialog model is one of the most prominent methods us...

research (03/13/2018)
Narcissus: Deriving Correct-By-Construction Decoders and Encoders from Binary Formats
Every injective function has an inverse, although constructing the inver...

research (11/02/2022)
Verified Reversible Programming for Verified Lossless Compression
Lossless compression implementations typically contain two programs, an ...

research (08/31/2021)
A manifold learning perspective on representation learning: Learning decoder and representations without an encoder
Autoencoders are commonly used in representation learning. They consist ...

research (03/07/2018)
An efficient framework for learning sentence representations
In this work we propose a simple and efficient framework for learning se...

research (07/26/2016)
Tweet2Vec: Learning Tweet Embeddings Using Character-level CNN-LSTM Encoder-Decoder
We present Tweet2Vec, a novel method for generating general-purpose vect...
