DeepAI

Prescribed Generative Adversarial Networks

10/09/2019
by   Adji B. Dieng, et al.

Generative adversarial networks (GANs) are a powerful approach to unsupervised learning. They have achieved state-of-the-art performance in the image domain. However, GANs are limited in two ways. They often learn distributions with low support—a phenomenon known as mode collapse—and they do not guarantee the existence of a probability density, which makes evaluating generalization using predictive log-likelihood impossible. In this paper, we develop the prescribed GAN (PresGAN) to address these shortcomings. PresGANs add noise to the output of a density network and optimize an entropy-regularized adversarial loss. The added noise renders tractable approximations of the predictive log-likelihood and stabilizes the training procedure. The entropy regularizer encourages PresGANs to capture all the modes of the data distribution. Fitting PresGANs involves computing the intractable gradients of the entropy regularization term; PresGANs sidestep this intractability using unbiased stochastic estimates. We evaluate PresGANs on several datasets and find that they mitigate mode collapse and generate samples with high perceptual quality. We further find that PresGANs reduce the gap in performance in terms of predictive log-likelihood between traditional GANs and variational autoencoders (VAEs).
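The core construction in the abstract—adding Gaussian noise to the generator's output so that samples have a tractable conditional density—can be illustrated with a minimal NumPy sketch. Everything below is an assumption for illustration: the "generator" is a toy linear map rather than a deep network, `log_sigma` stands in for the learnable per-dimension noise scale, and the predictive log-likelihood is approximated by naive Monte Carlo over the prior rather than the paper's estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def generator(z, W):
    # Toy deterministic density network: a linear map standing in for a deep generator.
    return z @ W

# Hypothetical dimensions for the sketch.
latent_dim, data_dim, n = 2, 3, 1000
W = rng.normal(size=(latent_dim, data_dim))
log_sigma = np.zeros(data_dim)  # learnable per-dimension noise scale (fixed here)
sigma = np.exp(log_sigma)

# PresGAN-style sampling: x = mu_theta(z) + sigma * eps, so x | z ~ N(mu_theta(z), sigma^2 I).
z = rng.normal(size=(n, latent_dim))
mu = generator(z, W)
x = mu + sigma * rng.normal(size=mu.shape)

def log_lik(x_star, n_samples=5000):
    """Monte Carlo estimate of log p(x_star) = log E_z[N(x_star | mu_theta(z), sigma^2 I)]."""
    zs = rng.normal(size=(n_samples, latent_dim))
    mus = generator(zs, W)
    # log N(x_star | mu, sigma^2 I), summed over data dimensions.
    log_p = -0.5 * np.sum(((x_star - mus) / sigma) ** 2
                          + 2 * log_sigma + np.log(2 * np.pi), axis=1)
    # log-mean-exp for numerical stability.
    m = log_p.max()
    return m + np.log(np.mean(np.exp(log_p - m)))

ll = log_lik(x[0])
```

The point of the sketch is the second half: because the noise prescribes an explicit Gaussian conditional, the marginal likelihood of a held-out point is an expectation that can be estimated, which is exactly what a noiseless GAN generator does not admit.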


Related Research

09/27/2019 · "Best-of-Many-Samples" Distribution Matching
Generative Adversarial Networks (GANs) can achieve state-of-the-art samp...

10/09/2018 · Entropic GANs meet VAEs: A Statistical Approach to Compute Sample Likelihoods in GANs
Building on the success of deep learning, two modern approaches to learn...

11/04/2019 · Improved BiGAN training with marginal likelihood equalization
We propose a novel training procedure for improving the performance of g...

03/07/2017 · Stopping GAN Violence: Generative Unadversarial Networks
While the costs of human violence have attracted a great deal of attenti...

01/24/2019 · Maximum Entropy Generators for Energy-Based Models
Unsupervised learning is about capturing dependencies between variables ...

01/28/2019 · Out-of-Sample Testing for GANs
We propose a new method to evaluate GANs, namely EvalGAN. EvalGAN relies...

09/24/2020 · GANs with Variational Entropy Regularizers: Applications in Mitigating the Mode-Collapse Issue
Building on the success of deep learning, Generative Adversarial Network...

Code Repositories

PresGANs

Prescribed Generative Adversarial Networks
