Evaluating GANs via Duality

11/13/2018
by Paulina Grnarova, et al.

Generative Adversarial Networks (GANs) have shown great results in accurately modeling complex distributions, but their training is known to be difficult due to instabilities caused by a challenging minimax optimization problem. This is especially troublesome given the lack of an evaluation metric that can reliably detect non-convergent behaviors. We leverage the notion of duality gap from game theory in order to propose a novel convergence metric for GANs that has low computational cost. We verify the validity of the proposed metric for various test scenarios commonly used in the literature.
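
To make the duality gap concrete: for a zero-sum game with value M(u, v), where u plays the role of the generator's parameters and v the discriminator's, the duality gap at a point (u, v) is the maximum of M(u, v') over v' minus the minimum of M(u', v) over u'. It is always non-negative and equals zero exactly at an equilibrium, which is what makes it usable as a convergence signal. The sketch below is a minimal illustration on a toy bilinear game, not the authors' procedure for GANs: the function name `duality_gap`, the payoff matrix, and the grid-search approximation of the inner max/min (a stand-in for the adversarial optimization one would run against neural players) are assumptions made here for illustration.

```python
import numpy as np

# Minimal sketch (not the paper's exact procedure): the duality gap of a
# point (u, v) in a zero-sum game with payoff M is
#     DG(u, v) = max_{v'} M(u, v') - min_{u'} M(u', v),
# which is >= 0 everywhere and 0 exactly at an equilibrium. In a GAN,
# u would be the generator's parameters and v the discriminator's, and the
# inner max/min would be approximated by a few adversarial training steps.

def duality_gap(M, u, v, candidates):
    """Duality gap of (u, v), with the inner max/min taken over a finite
    candidate set (a stand-in for training worst-case opponents)."""
    best_response_v = max(M(u, vp) for vp in candidates)  # max player's best response
    best_response_u = min(M(up, v) for up in candidates)  # min player's best response
    return best_response_v - best_response_u

# Toy bilinear game M(u, v) = u^T A v with its saddle point at the origin.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])
M = lambda u, v: float(u @ A @ v)

grid = [np.array([x, y])
        for x in np.linspace(-1.0, 1.0, 21)
        for y in np.linspace(-1.0, 1.0, 21)]

at_equilibrium = np.zeros(2)
off_equilibrium = np.array([0.8, -0.5])

print(duality_gap(M, at_equilibrium, at_equilibrium, grid))    # 0.0: converged
print(duality_gap(M, off_equilibrium, off_equilibrium, grid))  # 2.6: not converged
```

In the GAN setting the same quantity can be tracked across training checkpoints: a gap that shrinks toward zero indicates progress toward an equilibrium, while a gap that stays large flags the kind of non-convergent behavior the abstract describes.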

Related research

06/29/2018  Convergence Problems with Generative Adversarial Networks (GANs)
Generative adversarial networks (GANs) are a novel approach to generativ...

05/11/2021  Characterizing GAN Convergence Through Proximal Duality Gap
Despite the accomplishments of Generative Adversarial Networks (GANs) in...

06/10/2017  An Online Learning Approach to Generative Adversarial Networks
We consider the problem of training generative models with a Generative ...

07/12/2021  Hidden Convexity of Wasserstein GANs: Interpretable Generative Models with Closed-Form Solutions
Generative Adversarial Networks (GANs) are commonly used for modeling co...

12/16/2021  An Unsupervised Way to Understand Artifact Generating Internal Units in Generative Neural Networks
Despite significant improvements on the image generation performance of ...

12/12/2020  On Duality Gap as a Measure for Monitoring GAN Training
Generative adversarial network (GAN) is among the most popular deep lear...

10/07/2020  Training GANs with predictive projection centripetal acceleration
Although remarkably successful in practice, training generative adversar...
