
Improving Disentangled Text Representation Learning with Information-Theoretic Guidance

06/01/2020
by   Pengyu Cheng, et al.
Duke University
NEC Laboratories America
Microsoft

Learning disentangled representations of natural language is essential for many NLP tasks, e.g., conditional text generation, style transfer, and personalized dialogue systems. Similar problems have been studied extensively for other forms of data, such as images and videos. However, the discrete nature of natural language makes disentangling textual representations more challenging (e.g., manipulation in the data space cannot be easily achieved). Inspired by information theory, we propose a novel method that learns disentangled representations of text without any supervision on semantics. A new mutual information upper bound is derived and leveraged to measure the dependence between style and content. By minimizing this upper bound, the proposed method embeds style and content into two independent low-dimensional spaces. Experiments on both conditional text generation and text-style transfer demonstrate the high quality of our disentangled representations in terms of content and style preservation.
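The core idea — a sample-based upper bound on the mutual information between style and content embeddings, used as a training penalty — can be illustrated with a small sketch. The snippet below is a simplified, hypothetical illustration (not the paper's implementation): it uses a Gaussian variational approximation q(c|s) with a least-squares linear mean, and estimates the bound as the gap between positive-pair and shuffled-pair log-likelihoods; the function name `mi_upper_bound` and the linear form of q are assumptions for the sketch.

```python
import numpy as np

def mi_upper_bound(s, c, sigma=1.0):
    """Sample-based MI upper bound between style (s) and content (c) embeddings.

    Hypothetical sketch: q(c|s) is a Gaussian whose mean is a linear map of s
    fitted by least squares. The bound is
        E_{p(s,c)}[log q(c|s)] - E_{p(s)p(c)}[log q(c|s)],
    i.e., positive pairs should score higher under q than mismatched pairs;
    minimizing this quantity pushes s and c toward independence.
    """
    # Fit the variational mean mu(s) = s @ W by least squares.
    W, *_ = np.linalg.lstsq(s, c, rcond=None)
    mu = s @ W
    # Log-likelihood of matched (s_i, c_i) pairs, up to a constant
    # shared by both terms (so it cancels in the difference).
    pos = -((c - mu) ** 2).sum(axis=1) / (2 * sigma ** 2)
    # Log-likelihood of all mismatched pairs: every c_j scored under mu(s_i).
    diff = c[None, :, :] - mu[:, None, :]            # shape (n, n, d)
    neg = -(diff ** 2).sum(axis=2) / (2 * sigma ** 2)
    return pos.mean() - neg.mean()
```

In training, an estimator like this would be evaluated on minibatches of encoder outputs and added to the loss, so that gradient descent shrinks the style-content dependence while the reconstruction objective preserves each factor.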

