On the Quantitative Analysis of Decoder-Based Generative Models

11/14/2016
by Yuhuai Wu, et al.

The past several years have seen remarkable progress in generative models which produce convincing samples of images and other modalities. A shared component of many powerful generative models is a decoder network, a parametric deep neural net that defines a generative distribution. Examples include variational autoencoders, generative adversarial networks, and generative moment matching networks. Unfortunately, it can be difficult to quantify the performance of these models because of the intractability of log-likelihood estimation, and inspecting samples can be misleading. We propose to use Annealed Importance Sampling for evaluating log-likelihoods for decoder-based models and validate its accuracy using bidirectional Monte Carlo. The evaluation code is provided at https://github.com/tonywu95/eval_gen. Using this technique, we analyze the performance of decoder-based models, the effectiveness of existing log-likelihood estimators, the degree of overfitting, and the degree to which these models miss important modes of the data distribution.
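To make the proposed evaluation concrete, the following is a minimal sketch (not the authors' code, which is at https://github.com/tonywu95/eval_gen) of how AIS estimates log p(x) for a decoder-based model: chains start from the prior p(z), anneal through intermediate distributions proportional to p(z) p(x|z)^beta, and accumulate importance weights whose log-mean-exp is a stochastic lower bound on log p(x). The linear-Gaussian toy decoder, Metropolis-Hastings transitions, and all hyperparameters below are illustrative assumptions, not the paper's setup.

```python
# Minimal sketch of AIS log-likelihood estimation for a decoder-based model.
# Assumptions (not from the paper's code): Gaussian prior p(z) = N(0, I),
# a toy linear-Gaussian "decoder" p(x|z) = N(Wz, sigma^2 I), and simple
# random-walk Metropolis-Hastings transitions between annealing steps.
import numpy as np

rng = np.random.default_rng(0)
D_Z, D_X, SIGMA = 2, 5, 0.5
W = rng.standard_normal((D_X, D_Z))                                    # toy decoder weights
x = W @ rng.standard_normal(D_Z) + SIGMA * rng.standard_normal(D_X)    # one observation

def log_prior(z):
    return -0.5 * np.sum(z ** 2) - 0.5 * D_Z * np.log(2 * np.pi)

def log_likelihood(z):
    mu = W @ z
    return (-0.5 * np.sum((x - mu) ** 2) / SIGMA ** 2
            - 0.5 * D_X * np.log(2 * np.pi * SIGMA ** 2))

def ais_log_px(num_steps=1000, num_chains=16, mh_steps=5, step_size=0.1):
    """Anneal from the prior to the posterior p(z|x); the averaged importance
    weight estimates the marginal likelihood p(x)."""
    betas = np.linspace(0.0, 1.0, num_steps + 1)
    log_w = np.zeros(num_chains)
    z = rng.standard_normal((num_chains, D_Z))        # initial samples from the prior
    for t in range(1, num_steps + 1):
        for c in range(num_chains):
            # Accumulate the AIS weight: (beta_t - beta_{t-1}) * log p(x|z).
            log_w[c] += (betas[t] - betas[t - 1]) * log_likelihood(z[c])
            # MH moves targeting the intermediate distribution p(z) p(x|z)^beta_t.
            for _ in range(mh_steps):
                prop = z[c] + step_size * rng.standard_normal(D_Z)
                log_accept = (log_prior(prop) + betas[t] * log_likelihood(prop)
                              - log_prior(z[c]) - betas[t] * log_likelihood(z[c]))
                if np.log(rng.random()) < log_accept:
                    z[c] = prop
    # log-mean-exp of the weights: a stochastic lower bound on log p(x).
    return np.logaddexp.reduce(log_w) - np.log(num_chains)

print("AIS estimate of log p(x):", ais_log_px())
```

Because AIS produces an unbiased estimate of p(x), the log of the averaged weights underestimates log p(x) on average; bidirectional Monte Carlo pairs this forward estimate with a reverse-direction upper bound on simulated data, which is how the paper validates the accuracy of the AIS evaluator.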


Related research

07/09/2018 · Glow: Generative Flow with Invertible 1x1 Convolutions
Flow-based generative models (Dinh et al., 2014) are conceptually attrac...

06/23/2019 · Bias Correction of Learned Generative Models using Likelihood-Free Importance Weighting
A learned generative model often produces biased statistics relative to ...

10/02/2014 · Deep Directed Generative Autoencoders
For discrete data, the likelihood P(x) can be rewritten exactly and para...

08/15/2020 · Evaluating Lossy Compression Rates of Deep Generative Models
The field of deep generative modeling has succeeded in producing astonis...

07/01/2020 · In-Distribution Interpretability for Challenging Modalities
It is widely recognized that the predictions of deep neural networks are...

01/19/2017 · PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications
PixelCNNs are a recently proposed class of powerful generative models wi...

08/15/2022 · Applying Regularized Schrödinger-Bridge-Based Stochastic Process in Generative Modeling
Compared to the existing function-based models in deep generative modeli...
