On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation

09/30/2019
by   Victor Prokhorov, et al.

Variational Autoencoders (VAEs) are known to learn uninformative latent representations of the input due to issues such as approximate posterior collapse or entanglement of the latent space. We impose an explicit constraint on the Kullback-Leibler (KL) divergence term inside the VAE objective function. While the explicit constraint naturally avoids posterior collapse, we use it to further understand the significance of the KL term in controlling the information transmitted through the VAE channel. Within this framework, we explore different properties of the estimated posterior distribution, and highlight the trade-off between the amount of information encoded in a latent code during training and the generative capacity of the model.
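One common way to realize such an explicit KL constraint, consistent with the abstract's description, is to replace the usual additive KL term with a penalty on the deviation of the KL from a fixed target rate C, so that KL = 0 (posterior collapse) is penalized as strongly as an overshoot. The sketch below is illustrative only; the function names, the diagonal-Gaussian parameterization, and the |KL - C| penalty form are assumptions, not the authors' exact implementation.

```python
import math

def kl_diag_gaussian(mu, logvar):
    # KL( N(mu, diag(sigma^2)) || N(0, I) ) for a diagonal Gaussian posterior,
    # summed over latent dimensions: 0.5 * sum(mu^2 + sigma^2 - log sigma^2 - 1)
    return 0.5 * sum(m * m + math.exp(lv) - lv - 1.0
                     for m, lv in zip(mu, logvar))

def constrained_vae_objective(recon_loss, mu, logvar, target_kl=3.0, beta=1.0):
    # Hypothetical constrained objective: instead of adding the KL term
    # directly (as in the standard ELBO), penalize its absolute deviation
    # from a target value C (target_kl). Driving the KL to zero is then
    # penalized, which discourages posterior collapse, while the target C
    # caps how much information the latent channel transmits.
    kl = kl_diag_gaussian(mu, logvar)
    return recon_loss + beta * abs(kl - target_kl), kl
```

With a collapsed posterior (mu = 0, logvar = 0, hence KL = 0) and target C = 3, the penalty contributes the full 3 nats to the loss, so the optimizer is pushed to encode information in the latent code rather than ignore it.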


Related research

- 10/29/2019, Bridging the ELBO and MMD
  One of the challenges in training generative models such as the variatio...

- 09/06/2023, CR-VAE: Contrastive Regularization on Variational Autoencoders for Preventing Posterior Collapse
  The Variational Autoencoder (VAE) is known to suffer from the phenomenon...

- 08/31/2018, Spherical Latent Spaces for Stable Variational Autoencoders
  A hallmark of variational autoencoders (VAEs) for text processing is the...

- 10/28/2021, Preventing posterior collapse in variational autoencoders for text generation via decoder regularization
  Variational autoencoders trained to minimize the reconstruction error ar...

- 07/03/2020, Variational Autoencoders for Anomalous Jet Tagging
  We present a detailed study on Variational Autoencoders (VAEs) for anoma...

- 11/11/2019, Evaluating Combinatorial Generalization in Variational Autoencoders
  We evaluate the ability of variational autoencoders to generalize to uns...

- 06/22/2018, Probabilistic Natural Language Generation with Wasserstein Autoencoders
  Probabilistic generation of natural language sentences is an important t...
