VAE-KRnet and its applications to variational Bayes

06/29/2020
by   Xiaoliang Wan, et al.

In this work, we propose a generative model for density estimation, called VAE-KRnet, which combines the canonical variational autoencoder (VAE) with our recently developed flow-based generative model, KRnet. The VAE is used as a dimension reduction technique to capture the latent space, and KRnet is used to model the distribution of the latent variables. Using a linear model between the data and the latent variables, we show that VAE-KRnet can be more effective and robust than the canonical VAE. As an application, we apply VAE-KRnet to variational Bayes to approximate the posterior. Variational Bayes approaches are usually based on minimizing the Kullback-Leibler (KL) divergence between the model and the posterior, which often underestimates the variance when the model is not sufficiently expressive. For high-dimensional distributions, however, it is very challenging to construct an accurate model, since extra assumptions are often needed for efficiency; e.g., the mean-field approach assumes mutual independence between dimensions. When the number of dimensions is relatively small, KRnet can be used to approximate the posterior effectively with respect to the original random variable. For high-dimensional cases, we use VAE-KRnet to incorporate dimension reduction. To alleviate the underestimation of the variance, we add the maximization of the mutual information between the latent random variable and the original one to the KL-divergence-based objective. Numerical experiments are presented to demonstrate the effectiveness of our model.
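The variance underestimation mentioned above can be made concrete with a standard textbook example (not code from the paper): for a correlated 2-D Gaussian posterior with unit marginal variances and correlation rho, the mean-field approximation q(x) = q1(x1)q2(x2) that minimizes KL(q || p) matches the diagonal of the posterior precision matrix, so its marginal variance is 1 - rho^2 < 1. A minimal sketch:

```python
import math


def mean_field_variance(rho: float) -> float:
    """Optimal mean-field marginal variance for KL(q || p) against a 2-D
    Gaussian posterior with unit marginal variances and correlation rho.

    The posterior covariance is [[1, rho], [rho, 1]], whose precision matrix
    has diagonal entries 1 / (1 - rho^2). The mean-field optimum for the
    reverse KL matches the precision diagonal, giving variance 1 - rho^2.
    """
    if not -1.0 < rho < 1.0:
        raise ValueError("correlation must lie in (-1, 1)")
    return 1.0 - rho * rho


def kl_gauss_1d(var_q: float, var_p: float) -> float:
    """KL divergence N(0, var_q) || N(0, var_p) between zero-mean 1-D Gaussians."""
    r = var_q / var_p
    return 0.5 * (r - 1.0 - math.log(r))


if __name__ == "__main__":
    # The stronger the correlation, the more the mean-field model
    # underestimates the true marginal variance of 1.0.
    for rho in (0.0, 0.5, 0.9):
        v = mean_field_variance(rho)
        print(f"rho = {rho:.1f}: mean-field variance = {v:.2f} (true marginal = 1.00)")
```

This is exactly the failure mode the mutual-information term in the VAE-KRnet objective is meant to counteract; a more expressive model of the latent distribution (such as KRnet) can capture the correlation structure that the mean-field factorization discards.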


Related research:

09/14/2018 Variational Autoencoder with Implicit Optimal Priors

01/06/2019 MAE: Mutual Posterior-Divergence Regularization for Variational AutoEncoders

06/08/2023 Unscented Autoencoder

04/27/2020 A Batch Normalized Inference Network Keeps the KL Vanishing Away

12/22/2018 Disentangling Latent Space for VAE by Label Relevant/Irrelevant Dimensions

07/30/2020 Quantitative Understanding of VAE by Interpreting ELBO as Rate Distortion Cost of Transform Coding

09/25/2019 Explicitly disentangling image content from translation and rotation with spatial-VAE
