VAEBM: A Symbiosis between Variational Autoencoders and Energy-based Models

10/01/2020
by Zhisheng Xiao, et al.

Energy-based models (EBMs) have recently been successful in representing complex distributions of small images. However, sampling from them requires expensive Markov chain Monte Carlo (MCMC) iterations that mix slowly in high-dimensional pixel space. Unlike EBMs, variational autoencoders (VAEs) generate samples quickly and are equipped with a latent space that enables fast traversal of the data manifold. However, VAEs tend to assign high probability density to regions of data space outside the actual data distribution and often fail to generate sharp images. In this paper, we propose VAEBM, a symbiotic composition of a VAE and an EBM that offers the best of both worlds. VAEBM captures the overall mode structure of the data distribution using a state-of-the-art VAE, and relies on its EBM component to explicitly exclude non-data-like regions from the model and refine the image samples. Moreover, the VAE component in VAEBM allows us to speed up MCMC updates by reparameterizing them in the VAE's latent space. Our experimental results show that VAEBM outperforms state-of-the-art VAEs and EBMs in generative quality on several benchmark image datasets by a large margin. It can generate high-quality images as large as 256×256 pixels with short MCMC chains. We also demonstrate that VAEBM provides complete mode coverage and performs well in out-of-distribution detection.
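The key computational idea in the abstract is running the MCMC chain in the VAE's latent space rather than in pixel space: Langevin updates move a latent code z so that the decoded sample x = g(z) descends the energy E(x). Below is a minimal, self-contained sketch of that reparameterized Langevin update. The linear "decoder" `decode`, the quadratic `energy`, and all step sizes are toy assumptions standing in for the trained VAE decoder and learned neural energy of the actual paper; only the update rule itself reflects the described technique.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((4, 2))   # toy decoder weights: 2-d latent -> 4-d "data"
target = np.ones(4)               # toy mode of the data distribution

def decode(z):
    # stand-in for the VAE decoder g(z); linear here so gradients are analytic
    return W @ z

def energy(x):
    # stand-in for the learned energy E(x); lower = more data-like
    return 0.5 * np.sum((x - target) ** 2)

def grad_energy_z(z):
    # chain rule through the decoder: dE/dz = (dg/dz)^T dE/dx
    return W.T @ (decode(z) - target)

def langevin_in_latent(z, n_steps=200, step=0.05, noise_scale=0.01):
    """Langevin MCMC on the latent code instead of pixel space.

    Each update is a gradient step on E(g(z)) plus Gaussian noise;
    the small noise_scale mimics the tempered, short chains used in practice.
    """
    for _ in range(n_steps):
        noise = rng.standard_normal(z.shape)
        z = z - step * grad_energy_z(z) + np.sqrt(2 * step) * noise_scale * noise
    return z

z0 = rng.standard_normal(2)       # initial latent, as if sampled from the VAE prior
z_final = langevin_in_latent(z0)
```

Because the chain operates on a 2-dimensional z rather than the 4-dimensional x, each step is cheaper and mixes over the decoder's manifold directly; this is the sketch-level analogue of why short latent-space chains suffice for the large images reported in the paper.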


Related research

- 12/29/2020: Learning Energy-Based Model with Variational Auto-Encoder as Amortized Sampler
- 10/28/2016: Improving Sampling from Generative Autoencoders with Markov Chains
- 02/05/2023: Latent Reconstruction-Aware Variational Autoencoder
- 10/06/2020: NCP-VAE: Variational Autoencoders with Noise Contrastive Priors
- 07/08/2020: NVAE: A Deep Hierarchical Variational Autoencoder
- 05/20/2023: Normalizing flow sampling with Langevin dynamics in the latent space
- 02/04/2023: PartitionVAE – a human-interpretable VAE
