Coverage and Quality Driven Training of Generative Image Models

01/04/2019
by Konstantin Shmelkov, et al.

Generative modeling of natural images has been extensively studied in recent years, yielding remarkable progress. Current state-of-the-art methods are based on either maximum likelihood estimation or adversarial training, and the two approaches have complementary drawbacks. The first leads to over-generalization: the maximum likelihood criterion encourages models to cover the full support of the training data, heavily penalizing any training example that receives little probability mass. The simplifying assumptions in such models limit their capacity and make them spill mass on unrealistic samples. The second leads to mode-dropping: adversarial training encourages high-quality samples from the model, but enforces diversity among the samples only indirectly. To overcome these drawbacks we make two contributions. First, we propose a novel extension of the variational autoencoder model that uses deterministic invertible transformation layers to map samples from the decoder to the image space. This induces correlations among the pixels given the latent variables, improving over commonly used factorial decoders. Second, we propose a training approach that leverages both coverage- and quality-based criteria. Our models obtain likelihood scores competitive with state-of-the-art likelihood-based models, while achieving sample quality typical of adversarially trained networks.
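
To make the first contribution concrete, below is a minimal sketch (in PyTorch) of a factorial decoder whose output is passed through deterministic invertible layers. The coupling-layer design and all names and sizes (AffineCoupling, FlowDecoder, the hidden widths) are illustrative assumptions, not the paper's architecture; the point is only that an invertible map on top of a factorial decoder yields correlated pixels given the latent code, while its log-Jacobian keeps the likelihood tractable.

    # Sketch: factorial VAE decoder followed by invertible (RealNVP-style
    # affine coupling) layers mapping its output to image space.
    # All names and sizes are illustrative, not from the paper.
    import torch
    import torch.nn as nn

    class AffineCoupling(nn.Module):
        """Invertible layer: transforms one half of the pixels with a
        scale/shift predicted from the other half (assumes even dim)."""
        def __init__(self, dim, hidden=128):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim // 2, hidden), nn.ReLU(),
                nn.Linear(hidden, dim),  # predicts log-scale and shift
            )

        def forward(self, x):
            x1, x2 = x.chunk(2, dim=1)
            log_s, t = self.net(x1).chunk(2, dim=1)
            y2 = x2 * torch.exp(log_s) + t  # invertible given x1
            return torch.cat([x1, y2], dim=1), log_s.sum(dim=1)

    class FlowDecoder(nn.Module):
        """Factorial decoder + invertible layers: pixels become
        correlated given the latent code z."""
        def __init__(self, latent_dim, img_dim, n_layers=4):
            super().__init__()
            self.decoder = nn.Sequential(
                nn.Linear(latent_dim, 256), nn.ReLU(),
                nn.Linear(256, img_dim),
            )
            self.flows = nn.ModuleList(
                [AffineCoupling(img_dim) for _ in range(n_layers)])

        def forward(self, z):
            x = self.decoder(z)  # factorial output given z
            log_det = torch.zeros(z.size(0), device=z.device)
            for flow in self.flows:
                x, ld = flow(x)
                log_det = log_det + ld
                x = x.roll(x.size(1) // 2, dims=1)  # alternate halves
            return x, log_det  # image-space sample + log |det J|

Calling, say, FlowDecoder(32, 784) on a batch of latent codes returns both the image-space sample and the accumulated log-determinant needed for the likelihood term.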

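The second contribution, training with both coverage and quality criteria, can be illustrated in the same spirit. The abstract does not specify the exact objective, so the sketch below simply combines the standard ELBO terms (coverage: reconstruction plus KL) with a non-saturating adversarial loss on prior samples (quality), weighted by a hypothetical lambda_quality; the encoder/decoder/discriminator interfaces are assumptions, not the paper's training procedure.

    # Sketch of one generator update mixing a coverage criterion (ELBO:
    # reconstruction + KL) with a quality criterion (adversarial loss).
    # The objective and lambda_quality weight are assumptions.
    import torch
    import torch.nn.functional as F

    def hybrid_step(encoder, decoder, discriminator, x, opt,
                    lambda_quality=0.1):
        opt.zero_grad()
        # Coverage: standard ELBO terms, averaged over the batch.
        mu, logvar = encoder(x)                  # assumed interface
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        recon, _ = decoder(z)                    # e.g. FlowDecoder above
        rec = F.mse_loss(recon, x, reduction='sum') / x.size(0)
        kl = -0.5 * torch.sum(
            1 + logvar - mu.pow(2) - logvar.exp()) / x.size(0)
        # Quality: samples drawn from the prior should fool the critic.
        fake, _ = decoder(torch.randn_like(mu))
        logits = discriminator(fake)
        quality = F.binary_cross_entropy_with_logits(
            logits, torch.ones_like(logits))
        loss = rec + kl + lambda_quality * quality
        loss.backward()
        opt.step()
        return loss.item()

The discriminator would be trained with the usual real/fake objective in a separate step; only the generator-side combination of the two criteria is shown here.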
