Adversarial Likelihood Estimation with One-way Flows

07/19/2023
by   Omri Ben-Dov, et al.

Generative Adversarial Networks (GANs) can produce high-quality samples but do not provide an estimate of the probability density around them. It has been noted, however, that maximizing the log-likelihood within an energy-based setting leads to an adversarial framework in which the discriminator provides an unnormalized density (often called the energy). We develop this perspective further, incorporate importance sampling, and show that 1) Wasserstein GAN computes a biased estimate of the partition function, for which we instead propose an unbiased estimator; and 2) when optimizing for likelihood, one must maximize the generator's entropy, which we hypothesize improves mode coverage. In contrast to previous work, we explicitly compute the density of the generated samples; this is the key enabler for both the unbiased partition-function estimator and the computation of the generator-entropy term. The generator density is obtained via a new type of flow network, called a one-way flow network, which is less constrained in terms of architecture because it does not require a tractable inverse. Our experimental results show that the model converges faster, produces sample quality comparable to GANs of similar architecture, avoids over-fitting on commonly used datasets, and produces smooth low-dimensional latent representations of the training data.
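The core mechanism the abstract describes, an unbiased importance-sampling estimate of the partition function with the generator serving as the proposal, can be sketched as follows. This is an illustrative toy (a 1D Gaussian energy with a hand-picked Gaussian proposal standing in for the tractable generator density), not the paper's implementation; all function names are our own.

```python
import numpy as np

def unbiased_partition_estimate(energy, sample_q, log_q, n=100_000, rng=None):
    """Importance-sampling estimate of Z = integral of exp(-E(x)) dx.

    Draws x ~ q and averages exp(-E(x)) / q(x); this is unbiased as long
    as q covers the support of exp(-E). Here q plays the role of the
    generator density, which a one-way flow would make tractable.
    """
    rng = rng or np.random.default_rng(0)
    x = sample_q(rng, n)
    log_w = -energy(x) - log_q(x)  # log importance weights
    return np.exp(log_w).mean()

# Toy check: E(x) = x^2 / 2, so Z = sqrt(2*pi) exactly.
sigma = 1.5  # proposal wider than the target, for safe coverage
energy = lambda x: 0.5 * x**2
sample_q = lambda rng, n: rng.normal(0.0, sigma, size=n)
log_q = lambda x: -0.5 * (x / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

z_hat = unbiased_partition_estimate(energy, sample_q, log_q)
print(z_hat)  # close to sqrt(2*pi) ~ 2.5066
```

Because the weights exp(-E(x))/q(x) are averaged directly (rather than, say, log-averaged), the estimator's expectation is exactly Z; the bias the paper attributes to the Wasserstein GAN setup comes from lacking access to q, which is precisely what the explicit generator density supplies.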

Related research

- 10/09/2019, Prescribed Generative Adversarial Networks
  Generative adversarial networks (GANs) are a powerful approach to unsupe...
- 06/23/2022, LED: Latent Variable-based Estimation of Density
  Modern generative models are roughly divided into two main categories: (...
- 01/24/2019, Maximum Entropy Generators for Energy-Based Models
  Unsupervised learning is about capturing dependencies between variables ...
- 04/02/2021, Partition-Guided GANs
  Despite the success of Generative Adversarial Networks (GANs), their tra...
- 11/04/2019, Improved BiGAN training with marginal likelihood equalization
  We propose a novel training procedure for improving the performance of g...
- 10/19/2021, Latent reweighting, an almost free improvement for GANs
  Standard formulations of GANs, where a continuous function deforms a con...
- 11/01/2021, Bounds all around: training energy-based models with bidirectional bounds
  Energy-based models (EBMs) provide an elegant framework for density esti...
