Preventing Posterior Collapse Induced by Oversmoothing in Gaussian VAE

02/17/2021
by   Yuhta Takida, et al.

Variational autoencoders (VAEs) often suffer from posterior collapse, a phenomenon in which the learned latent space becomes uninformative. Posterior collapse is often related to a hyperparameter resembling the data variance: an inappropriate choice of this parameter can be shown to cause oversmoothing and posterior collapse in the linearly approximated case, and the same effect can be verified empirically in the general case. We therefore propose AR-ELBO (Adaptively Regularized Evidence Lower BOund), which controls the smoothness of the model by adapting this variance parameter. In addition, we extend the VAE with alternative parameterizations of the variance parameter to deal with non-uniform or conditional data variance. VAE extensions trained with AR-ELBO achieve improved Fréchet inception distance (FID) on images generated from the MNIST and CelebA datasets.
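
The hyperparameter in question is the decoder variance σ² of the Gaussian observation model p(x|z) = N(x; μ_θ(z), σ²I): the ELBO's reconstruction term is scaled by 1/(2σ²), so a fixed, poorly chosen σ² misweights the KL regularizer and can oversmooth the model into posterior collapse. The following PyTorch sketch is an illustration in that spirit, not the paper's reference implementation; the network sizes and the per-batch maximum-likelihood update of σ² are assumptions made for brevity.

```python
# Minimal sketch (not the authors' implementation): a Gaussian VAE objective in
# which the decoder variance sigma^2 is adapted from the data rather than fixed.
import math

import torch
import torch.nn as nn
import torch.nn.functional as F


class GaussianVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16, h_dim=256):  # sizes are illustrative
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, h_dim), nn.ReLU())
        self.mu = nn.Linear(h_dim, z_dim)
        self.logvar = nn.Linear(h_dim, z_dim)
        self.dec = nn.Sequential(nn.Linear(z_dim, h_dim), nn.ReLU(),
                                 nn.Linear(h_dim, x_dim))
        # log sigma^2 of the decoder N(x; dec(z), sigma^2 I); updated from data
        # below. The paper's global, pixel-wise, and conditional variants differ
        # in how this quantity is parameterized.
        self.register_buffer("log_var_x", torch.zeros(()))

    def forward(self, x):
        h = self.enc(x)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        return self.dec(z), mu, logvar


def adaptive_elbo_loss(model, x):
    """Negative ELBO with the decoder variance set to its per-batch MLE."""
    x_hat, mu, logvar = model(x)
    sq_err = F.mse_loss(x_hat, x, reduction="none").sum(dim=1)  # ||x - x_hat||^2
    with torch.no_grad():  # sigma^2 <- mean squared error per dimension
        model.log_var_x.copy_((sq_err.mean() / x.shape[1]).clamp_min(1e-6).log())
    var_x = model.log_var_x.exp()
    # Gaussian NLL: D/2 * log(2*pi*sigma^2) + ||x - x_hat||^2 / (2*sigma^2)
    rec = 0.5 * (x.shape[1] * (model.log_var_x + math.log(2 * math.pi))
                 + sq_err / var_x)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)  # KL to N(0, I)
    return (rec + kl).mean()
```

With this objective, only the encoder and decoder parameters are trained by gradient descent while σ² tracks the reconstruction error, so the effective weight of the KL term adapts during training instead of being tuned by hand; the non-uniform and conditional variance parameterizations discussed in the paper are not shown here.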


Related research

Variance Loss in Variational Autoencoders (02/23/2020)
In this article, we highlight what appears to be major issue of Variatio...

CR-VAE: Contrastive Regularization on Variational Autoencoders for Preventing Posterior Collapse (09/06/2023)
The Variational Autoencoder (VAE) is known to suffer from the phenomenon...

Unscented Autoencoder (06/08/2023)
The Variational Autoencoder (VAE) is a seminal approach in deep generati...

Variational Variance: Simple and Reliable Predictive Variance Parameterization (06/08/2020)
An often overlooked sleight of hand performed with variational autoencod...

Simple and Effective VAE Training with Calibrated Decoders (06/23/2020)
Variational autoencoders (VAEs) provide an effective and simple method f...

Balancing Reconstruction Quality and Regularisation in ELBO for VAEs (09/09/2019)
A trade-off exists between reconstruction quality and the prior regulari...

The Neglected Sibling: Isotropic Gaussian Posterior for VAE (10/14/2021)
Deep generative models have been widely used in several areas of NLP, an...
