Preventing posterior collapse in variational autoencoders for text generation via decoder regularization

10/28/2021
by   Alban Petit, et al.

Variational autoencoders trained to minimize the reconstruction error are sensitive to the posterior collapse problem, that is, the approximate posterior distribution becomes equal to the prior and the latent variable no longer carries information about the input. We propose a novel regularization method based on fraternal dropout to prevent posterior collapse. We evaluate our approach using several metrics and observe improvements in all tested configurations.
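
The sketch below illustrates the general idea of combining a VAE objective with a fraternal-dropout-style penalty on the decoder: the decoder is run twice with independent dropout masks and the two predictions are encouraged to agree. This is a minimal illustration in PyTorch, not the authors' code; the interfaces (encoder, decoder), the penalty weight kappa, and the KL weight beta are assumptions made for the example.

import torch
import torch.nn.functional as F

def vae_step(encoder, decoder, tokens, beta=1.0, kappa=0.1):
    """One training step: ELBO plus a fraternal-dropout-style penalty.

    Assumed interfaces (illustrative, not from the paper):
      encoder(tokens) -> (mu, logvar) of the approximate posterior q(z|x)
      decoder(tokens, z) -> vocabulary logits, with dropout active so two
      forward passes use two different dropout masks
    """
    mu, logvar = encoder(tokens)

    # Reparameterization trick: z = mu + sigma * eps
    std = torch.exp(0.5 * logvar)
    z = mu + std * torch.randn_like(std)

    # Two decoder passes with independent dropout masks (fraternal dropout)
    logits_a = decoder(tokens, z)
    logits_b = decoder(tokens, z)

    # Reconstruction term: average the cross-entropy of the two passes
    targets = tokens[:, 1:]  # next-token prediction
    def xent(logits):
        return F.cross_entropy(
            logits[:, :-1].reshape(-1, logits.size(-1)),
            targets.reshape(-1))
    rec = 0.5 * (xent(logits_a) + xent(logits_b))

    # Closed-form KL(q(z|x) || N(0, I)); posterior collapse shows up as KL -> 0
    kl = -0.5 * torch.mean(
        torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1))

    # Fraternal-dropout penalty: the two dropout-perturbed predictions should
    # agree, which regularizes the decoder
    frat = F.mse_loss(logits_a, logits_b)

    return rec + beta * kl + kappa * frat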


Related research

11/01/2022 · Improving Variational Autoencoders with Density Gap-based Regularization
Variational autoencoders (VAEs) are one of the powerful unsupervised lea...

09/30/2019 · On the Importance of the Kullback-Leibler Divergence Term in Variational Autoencoders for Text Generation
Variational Autoencoders (VAEs) are known to suffer from learning uninfo...

09/22/2021 · LDC-VAE: A Latent Distribution Consistency Approach to Variational AutoEncoders
Variational autoencoders (VAEs), as an important aspect of generative mo...

06/07/2017 · InfoVAE: Information Maximizing Variational Autoencoders
It has been previously observed that variational autoencoders tend to ig...

04/21/2018 · Eval all, trust a few, do wrong to none: Comparing sentence generation models
In this paper, we study recent neural generative models for text generat...

06/08/2023 · Posterior Collapse in Linear Conditional and Hierarchical Variational Autoencoders
The posterior collapse phenomenon in variational autoencoders (VAEs), wh...

11/19/2018 · Variational Bayesian Dropout
Variational dropout (VD) is a generalization of Gaussian dropout, which ...
