Neural Gaussian Copula for Variational Autoencoder

09/09/2019
by Prince Zizhuang Wang, et al.

Variational language models seek to estimate the posterior of latent variables given the input text with an approximated variational posterior. The model often assumes the variational posterior to be factorized even when the true posterior is not. Under this assumption, the learned variational posterior cannot capture the dependency relationships among latent variables. We argue that this mismatch contributes to posterior collapse, a common training problem in variational language models. We propose the Gaussian Copula Variational Autoencoder (VAE) to avert this problem. Copulas are widely used to model correlation and dependencies among high-dimensional random variables, and they therefore help preserve the dependency relationships that are lost in a factorized VAE. Our empirical results show that by explicitly modeling the correlation of latent variables with a neural parametric copula, we avert this training difficulty while achieving results competitive with other VAE approaches.
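To make the idea concrete, below is a minimal sketch (not the authors' exact architecture) of how a neural parametric Gaussian copula can couple the latent dimensions of a VAE encoder, assuming PyTorch. All names (CopulaEncoder, hidden_dim, the row-normalized Cholesky construction) are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a Gaussian-copula reparameterization for a VAE latent.
# Assumes PyTorch; the architecture details here are illustrative only.
import torch
import torch.nn as nn


class CopulaEncoder(nn.Module):
    """Encode x into Gaussian marginals (mu, sigma) plus a neural
    parametric copula, represented by a correlation matrix R = L L^T."""

    def __init__(self, input_dim, hidden_dim, latent_dim):
        super().__init__()
        self.latent_dim = latent_dim
        self.body = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.Tanh())
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.log_sigma = nn.Linear(hidden_dim, latent_dim)
        # Predicts the strictly lower-triangular entries of L.
        self.tril = nn.Linear(hidden_dim, latent_dim * (latent_dim - 1) // 2)

    def forward(self, x):
        h = self.body(x)
        mu, sigma = self.mu(h), self.log_sigma(h).exp()
        # Build a lower-triangular L with unit diagonal, then normalize each
        # row so diag(L L^T) = 1, i.e. R = L L^T is a valid correlation
        # matrix (the Gaussian copula parameter).
        batch, d = x.size(0), self.latent_dim
        L = torch.eye(d, device=x.device).expand(batch, d, d).clone()
        rows, cols = torch.tril_indices(d, d, offset=-1)
        L[:, rows, cols] = self.tril(h)
        L = L / L.norm(dim=-1, keepdim=True)
        return mu, sigma, L

    def sample(self, mu, sigma, L):
        # Correlated reparameterization: eps ~ N(0, I), u = L eps ~ N(0, R),
        # then z_i = mu_i + sigma_i * u_i keeps the Gaussian marginals while
        # the copula couples the latent dimensions.
        eps = torch.randn(mu.size(0), self.latent_dim, 1, device=mu.device)
        u = torch.bmm(L, eps).squeeze(-1)
        return mu + sigma * u


# Usage example with toy dimensions.
enc = CopulaEncoder(input_dim=64, hidden_dim=128, latent_dim=8)
x = torch.randn(4, 64)
mu, sigma, L = enc(x)
z = enc.sample(mu, sigma, L)  # shape (4, 8), correlated latent sample
```

A factorized posterior corresponds to fixing L to the identity; letting the network predict L is what allows the approximate posterior to represent dependencies among latent dimensions.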


