Probabilistic Natural Language Generation with Wasserstein Autoencoders

06/22/2018
by Hareesh Bahuleyan, et al.

Probabilistic generation of natural language sentences is an important task in NLP. Existing models, such as variational autoencoders (VAEs) for sequence generation, are extremely difficult to train because the Kullback-Leibler (KL) term of the loss tends to collapse to zero, at which point the decoder ignores the latent code. Heuristics such as KL weight annealing and word dropout must be carefully engineered to train a text VAE successfully. In this paper, we propose the use of Wasserstein autoencoders (WAEs) for probabilistic natural language sentence generation. We show that sequence-to-sequence WAEs are more robust to hyperparameters and can be trained in a straightforward manner, without any weight annealing. Empirical evidence shows that the latent space learned by WAEs exhibits the continuity and smoothness of VAE latent spaces while achieving much higher BLEU scores for sentence reconstruction.
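The contrast the abstract draws comes down to the regularizer: a text VAE penalizes the KL divergence between each sentence's posterior and the prior (the term that collapses to zero), whereas a WAE matches the aggregate distribution of latent codes to the prior, so there is no per-sample KL term to collapse. Below is a minimal sketch of one common WAE formulation, the kernel-MMD penalty of Tolstikhin et al. (2018), written in PyTorch. The names (rbf_mmd2, encoder, reconstruction_loss, lam) are illustrative placeholders, and the abstract does not state whether this paper uses the MMD variant or the adversarial one.

```python
import torch

def rbf_mmd2(z_post, z_prior, sigma=1.0):
    """Plug-in (biased) estimate of MMD^2 between two batches of latent
    codes under an RBF kernel. Both inputs: shape (batch, latent_dim)."""
    def k(a, b):
        # pairwise squared Euclidean distances -> RBF kernel matrix
        return torch.exp(-torch.cdist(a, b) ** 2 / (2.0 * sigma ** 2))
    return (k(z_post, z_post).mean()
            + k(z_prior, z_prior).mean()
            - 2.0 * k(z_post, z_prior).mean())

# Hypothetical training step for a seq2seq autoencoder (not the authors' code):
# z = encoder(x)                    # latent codes for a batch of sentences
# z_prior = torch.randn_like(z)     # matching samples from the N(0, I) prior
# loss = reconstruction_loss + lam * rbf_mmd2(z, z_prior)
```

Because the penalty is computed once per batch over the aggregate code distribution rather than per example, the weight lam can stay fixed throughout training, which is consistent with the abstract's claim that no annealing schedule is needed.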


Related research

08/27/2018 · Natural Language Generation with Neural Variational Models
In this thesis, we explore the use of deep neural networks for generation...

03/25/2019 · Cyclical Annealing Schedule: A Simple Approach to Mitigating KL Vanishing
Variational autoencoders (VAEs) with an auto-regressive decoder have been...

09/30/2019 · On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation
Variational Autoencoders (VAEs) are known to suffer from learning uninformative...

07/28/2020 · Novel Potential Inhibitors Against SARS-CoV-2 Using Artificial Intelligence
Since known approved drugs like lopinavir and ritonavir failed...

04/21/2020 · Discrete Variational Attention Models for Language Generation
Variational autoencoders have been widely applied for natural language generation...

04/16/2020 · Do sequence-to-sequence VAEs learn global features of sentences?
A longstanding goal in NLP is to compute global sentence representations...

11/11/2019 · Evaluating Combinatorial Generalization in Variational Autoencoders
We evaluate the ability of variational autoencoders to generalize to unseen...
