LDC-VAE: A Latent Distribution Consistency Approach to Variational AutoEncoders

09/22/2021
by Xiaoyu Chen, et al.

Variational autoencoders (VAEs), an important class of generative models, have received considerable research interest and achieved many successful applications. However, achieving consistency between the learned latent distribution and the prior latent distribution remains a challenge when optimizing the evidence lower bound (ELBO), and this inconsistency ultimately degrades the quality of generated data. In this paper, we propose a latent distribution consistency approach that avoids such substantial inconsistency between the posterior and prior latent distributions during ELBO optimization. We name our method latent distribution consistency VAE (LDC-VAE). We achieve this by assuming that the true posterior distribution in latent space takes a Gibbs form and approximating it with our encoder. However, the Gibbs posterior has no analytical solution, and traditional approximation methods, such as iterative sampling-based MCMC, are time consuming. To address this problem, we use Stein Variational Gradient Descent (SVGD) to approximate the Gibbs posterior. In addition, we use SVGD to train a sampler network that draws samples efficiently from the Gibbs posterior. Comparative studies on popular image generation datasets show that our method achieves performance comparable to, or even better than, several strong improved variants of VAEs.
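Since the abstract rests on SVGD as the approximation engine, here is a minimal sketch of the generic SVGD update (Liu & Wang, 2016) driving a particle set toward a target density. The toy Gaussian target, particle count, step size, and median-heuristic bandwidth are illustrative assumptions, not the paper's setup; in LDC-VAE the score function would instead come from the Gibbs-form posterior over latent codes.

```python
# Minimal SVGD sketch on a toy target (assumptions: NumPy, RBF kernel with a
# median-heuristic bandwidth, standard-normal target). Not the LDC-VAE objective.
import numpy as np

def rbf_kernel(x, h=None):
    """RBF kernel matrix and summed kernel gradients (the repulsive term).

    x: (n, d) array of particles.
    Returns K with K[j, i] = k(x_j, x_i) and
    grad_K[i] = sum_j grad_{x_j} k(x_j, x_i).
    """
    sq_dists = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)  # (n, n)
    if h is None:
        # Median heuristic for the bandwidth (a common default, assumed here).
        h = np.median(sq_dists) / (2.0 * np.log(x.shape[0] + 1.0)) + 1e-8
    K = np.exp(-sq_dists / (2.0 * h))
    # grad_{x_j} k(x_j, x_i) = K[j, i] * (x_i - x_j) / h, summed over j:
    grad_K = (K.sum(axis=0)[:, None] * x - K.T @ x) / h
    return K, grad_K

def svgd_step(x, score_fn, step_size=0.1):
    """One SVGD update:
    phi(x_i) = mean_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]."""
    K, grad_K = rbf_kernel(x)
    phi = (K.T @ score_fn(x) + grad_K) / x.shape[0]
    return x + step_size * phi

# Toy stand-in target: standard normal, so score(x) = -x. In LDC-VAE the score
# would be the gradient of the log Gibbs posterior over latent codes.
rng = np.random.default_rng(0)
particles = rng.normal(5.0, 3.0, size=(200, 2))
for _ in range(500):
    particles = svgd_step(particles, lambda x: -x)
print(particles.mean(axis=0), particles.std(axis=0))  # roughly [0, 0] and [1, 1]
```

The kernel term pulls particles toward high-density regions of the target, while the kernel-gradient term pushes them apart, which is what lets SVGD act both as a posterior approximator and, in the paper's usage, as the training signal for a sampler network.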
