Which Training Methods for GANs do actually Converge?

01/13/2018
by Lars Mescheder, et al.

Recent work has shown local convergence of GAN training for absolutely continuous data and generator distributions. In this paper, we show that the requirement of absolute continuity is necessary: we describe a simple yet prototypical counterexample showing that in the more realistic case of distributions that are not absolutely continuous, unregularized GAN training is not always convergent. Furthermore, we discuss regularization strategies that were recently proposed to stabilize GAN training. Our analysis shows that GAN training with instance noise or zero-centered gradient penalties converges. On the other hand, we show that Wasserstein-GANs and WGAN-GP with a finite number of discriminator updates per generator update do not always converge to the equilibrium point. We discuss these results, leading us to a new explanation for the stability problems of GAN training. Based on our analysis, we extend our convergence results to more general GANs and prove local convergence for simplified gradient penalties even if the generator and data distribution lie on lower dimensional manifolds. We find these penalties to work well in practice and use them to learn a generative image model of all 1000 Imagenet classes in a single GAN with little hyperparameter tuning.
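The zero-centered gradient penalty the abstract refers to is the paper's simplified R1 regularizer: a term of the form (gamma/2) * E[ ||grad_x D(x)||^2 ], taken over real data points, added to the discriminator loss. Below is a minimal sketch of how such a penalty can be computed, assuming a PyTorch setup; the names `D`, `real_batch`, and `gamma` are illustrative, not taken from the paper's code.

```python
# Minimal sketch of a zero-centered (R1-style) gradient penalty in PyTorch.
# Assumptions: `D` is any discriminator module returning one score per sample,
# `real_batch` is a batch of real data, `gamma` is the regularization weight.
import torch

def r1_penalty(D, real_batch, gamma=10.0):
    """(gamma / 2) * E[ ||grad_x D(x)||^2 ], evaluated on real samples."""
    x = real_batch.detach().requires_grad_(True)
    out = D(x)
    # Gradient of the discriminator output w.r.t. its input;
    # create_graph=True keeps it differentiable for the backward pass.
    (grad,) = torch.autograd.grad(
        outputs=out.sum(), inputs=x, create_graph=True
    )
    grad_sq = grad.pow(2).reshape(grad.size(0), -1).sum(dim=1)
    return 0.5 * gamma * grad_sq.mean()
```

In a training loop, this term would simply be added to the usual discriminator loss on real batches. The paper's analysis shows that this kind of penalty restores local convergence even when the data and generator distributions lie on lower-dimensional manifolds, where unregularized GAN training can fail to converge.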


