Stylized Text Generation Using Wasserstein Autoencoders with a Mixture of Gaussian Prior

11/10/2019
by Amirpasha Ghabussi, et al.

Wasserstein autoencoders are effective for text generation. However, they do not provide any control over the style or topic of the generated sentences when the dataset contains multiple classes and topics. In this work, we present a semi-supervised approach for generating stylized sentences. Our model is trained on a multi-class dataset and learns the latent representation of the sentences using a mixture-of-Gaussians prior, without any adversarial losses. This allows us to generate sentences in the style of one or more specified classes by sampling from the corresponding prior components. Moreover, we can train our model on relatively small datasets and learn the latent representation of a specified class by augmenting the dataset with external data from other styles/classes. While a simple WAE or VAE cannot generate diverse sentences in this setting, sentences generated with our approach are diverse, fluent, and preserve the style and content of the desired classes.
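The class-conditional sampling described above can be sketched as drawing latent codes from the Gaussian component assigned to each style, then decoding them. The following is a minimal illustration with NumPy; the latent size, component means, isotropic covariance, and the latent-averaging scheme for mixing two styles are all illustrative assumptions, not values or procedures taken from the paper, and the decoder is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

latent_dim = 16
num_classes = 3
# One Gaussian component per style/class: a fixed mean vector and a shared
# isotropic standard deviation (both hypothetical here).
means = rng.normal(size=(num_classes, latent_dim))
sigma = 0.5

def sample_latent(class_ids, n_per_class=1):
    """Draw latent codes z from the Gaussian components of the given classes."""
    zs = []
    for c in class_ids:
        # z ~ N(mu_c, sigma^2 I) for the component of class c
        zs.append(means[c] + sigma * rng.normal(size=(n_per_class, latent_dim)))
    return np.concatenate(zs, axis=0)

# Single-style generation: sample five latent codes from class 0's component,
# which a trained decoder would map to sentences in that style.
z_style0 = sample_latent([0], n_per_class=5)

# Mixed-style generation (one possible scheme): average latent codes drawn
# from two different class components before decoding.
z_mix = 0.5 * sample_latent([0]) + 0.5 * sample_latent([1])
```

In a full model, these codes would be fed to the WAE decoder; the key point is that each style corresponds to one prior component, so no adversarial discriminator is needed to steer generation.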


Related research

- 03/17/2019: Topic-Guided Variational Autoencoders for Text Generation. "We propose a topic-guided variational autoencoder (TGVAE) model for text..."
- 05/29/2019: Latent Space Secrets of Denoising Text-Autoencoders. "While neural language models have recently demonstrated impressive perfo..."
- 06/16/2019: Fixing Gaussian Mixture VAEs for Interpretable Text Generation. "Variational auto-encoder (VAE) with Gaussian priors is effective in text..."
- 04/30/2020: Control, Generate, Augment: A Scalable Framework for Multi-Attribute Text Generation. "In this work, we present a text generation approach with multi-attribute..."
- 06/14/2023: DiffuDetox: A Mixed Diffusion Model for Text Detoxification. "Text detoxification is a conditional text generation task aiming to remo..."
- 06/15/2021: Unsupervised Abstractive Opinion Summarization by Generating Sentences with Tree-Structured Topic Guidance. "This paper presents a novel unsupervised abstractive summarization metho..."
- 06/21/2019: SeGMA: Semi-Supervised Gaussian Mixture Auto-Encoder. "We propose a semi-supervised generative model, SeGMA, which learns a joi..."
