Quasi-symplectic Langevin Variational Autoencoder

09/02/2020
by Zihao Wang, et al.

The variational autoencoder (VAE) is one of the most thoroughly investigated generative models and remains very popular in current neural learning research. Applying VAEs to practical tasks with high-dimensional data and large datasets, however, often runs into the difficulty of constructing a low-variance evidence lower bound (ELBO). Markov chain Monte Carlo (MCMC) is an effective approach for tightening the ELBO when approximating the posterior distribution. The Hamiltonian Variational Autoencoder (HVAE) is one such MCMC-inspired approach: it constructs an unbiased, low-variance ELBO that is also amenable to the reparameterization trick. Although this significantly improves posterior estimation, a main drawback of HVAE is that its leapfrog integrator must access the posterior gradient twice per step, which hurts inference efficiency and requires a fairly large amount of GPU memory. This flaw limits the applicability of Hamiltonian-based inference frameworks to large-scale network inference. To tackle this problem, we propose the Quasi-symplectic Langevin Variational Autoencoder (Langevin-VAE), which offers a significant improvement in resource-usage efficiency. We demonstrate qualitatively and quantitatively the effectiveness of the Langevin-VAE compared to state-of-the-art gradient-informed inference frameworks.

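The efficiency argument above hinges on how many posterior-gradient evaluations each transition needs. The sketch below is a minimal illustration of that point only, not the paper's actual integrator: it uses a toy standard-normal posterior, and the names log_post_grad, eps, and gamma are hypothetical choices made for the example. It contrasts one leapfrog step (two gradient calls, as in HVAE) with a damped Langevin-style step that reuses a single gradient call.

```python
# Minimal sketch (toy posterior, hypothetical step sizes): gradient-call count
# of a leapfrog step versus a damped Langevin-style step.
import numpy as np

def log_post_grad(z):
    # Toy standard-normal log-posterior: grad log p(z) = -z.
    return -z

def leapfrog_step(z, p, eps=0.1):
    """One leapfrog step: two posterior-gradient evaluations."""
    p_half = p + 0.5 * eps * log_post_grad(z)          # gradient evaluation 1
    z_new = z + eps * p_half
    p_new = p_half + 0.5 * eps * log_post_grad(z_new)  # gradient evaluation 2
    return z_new, p_new

def langevin_style_step(z, p, eps=0.1, gamma=0.9):
    """One damped Langevin-style step: a single gradient evaluation."""
    grad = log_post_grad(z)                            # the only gradient evaluation
    p_new = gamma * p + eps * grad                     # damped momentum update
    z_new = z + eps * p_new
    return z_new, p_new

rng = np.random.default_rng(0)
z, p = rng.normal(size=2), rng.normal(size=2)
print(leapfrog_step(z, p))
print(langevin_style_step(z, p))
```

In a VAE setting the gradient call corresponds to a backward pass through the decoder, so halving the number of calls per transition roughly halves the extra compute and activation memory added by each MCMC step.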
