
GANs Trained by a Two Time-Scale Update Rule Converge to a Local Nash Equilibrium

06/26/2017
by Martin Heusel, et al.
Johannes Kepler University Linz

Generative Adversarial Networks (GANs) excel at creating realistic images with complex models for which maximum likelihood is infeasible. However, the convergence of GAN training has still not been proven. We propose a two time-scale update rule (TTUR) for training GANs with stochastic gradient descent on arbitrary GAN loss functions. TTUR has an individual learning rate for both the discriminator and the generator. Using the theory of stochastic approximation, we prove that the TTUR converges under mild assumptions to a stationary local Nash equilibrium. The convergence carries over to the popular Adam optimization, for which we prove that it follows the dynamics of a heavy ball with friction and thus prefers flat minima in the objective landscape. To evaluate the performance of GANs at image generation, we introduce the "Fréchet Inception Distance" (FID), which captures the similarity of generated images to real ones better than the Inception Score. In experiments, TTUR improves learning for DCGANs and Improved Wasserstein GANs (WGAN-GP), outperforming conventional GAN training on CelebA, CIFAR-10, SVHN, LSUN Bedrooms, and the One Billion Word Benchmark.
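To make the TTUR idea concrete, here is a minimal PyTorch sketch. The toy architectures, the random stand-in data, and the specific learning rates (4e-4 for the discriminator vs. 1e-4 for the generator) are illustrative assumptions, not the paper's configuration; only the core mechanism, two separate Adam learning rates with the discriminator on the faster time scale, reflects the method itself.

```python
import torch
import torch.nn as nn

# Toy generator and discriminator; placeholder architectures,
# not the paper's DCGAN / WGAN-GP models.
G = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 784))
D = nn.Sequential(nn.Linear(784, 256), nn.LeakyReLU(0.2), nn.Linear(256, 1))

# TTUR: separate Adam learning rates, discriminator faster than generator.
# (Rates are illustrative; the paper tunes them per experiment.)
opt_D = torch.optim.Adam(D.parameters(), lr=4e-4)
opt_G = torch.optim.Adam(G.parameters(), lr=1e-4)

bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(64, 784)          # stand-in for a batch of real images
    fake = G(torch.randn(64, 128))       # generate from latent codes

    # Discriminator update on its own (faster) time scale.
    loss_D = bce(D(real), torch.ones(64, 1)) \
           + bce(D(fake.detach()), torch.zeros(64, 1))
    opt_D.zero_grad()
    loss_D.backward()
    opt_D.step()

    # Generator update on its own (slower) time scale.
    loss_G = bce(D(fake), torch.ones(64, 1))
    opt_G.zero_grad()
    loss_G.backward()
    opt_G.step()
```

The BCE loss here is just one choice; the convergence result is stated for arbitrary GAN loss functions, so the same two-optimizer pattern applies unchanged to, e.g., the WGAN-GP objective.

As a pointer to the Adam result: "heavy ball with friction" refers to the standard second-order ODE below, in which a friction term damps the trajectory and biases it toward flat minima. The notation is the generic HBF form, not copied from the paper.

```latex
\ddot{\theta}(t) + a(t)\,\dot{\theta}(t) + \nabla f\big(\theta(t)\big) = 0
```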
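For reference, the FID is the Fréchet distance between two Gaussians fitted to Inception features of real and generated images. With $(\mu_r, \Sigma_r)$ and $(\mu_g, \Sigma_g)$ the feature means and covariances of the real and generated data, respectively:

```latex
\mathrm{FID} = \lVert \mu_r - \mu_g \rVert_2^2
  + \operatorname{Tr}\!\left( \Sigma_r + \Sigma_g
  - 2\,(\Sigma_r \Sigma_g)^{1/2} \right)
```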
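A sketch of how this formula might be evaluated on precomputed feature statistics, using NumPy and SciPy's matrix square root (the function name and arguments here are my own, not from the paper's code):

```python
import numpy as np
from scipy.linalg import sqrtm

def frechet_inception_distance(mu_r, sigma_r, mu_g, sigma_g):
    """FID between two Gaussians (mu_r, sigma_r) and (mu_g, sigma_g)
    fitted to Inception pool features of real and generated images."""
    diff = mu_r - mu_g
    # Matrix square root of the covariance product; discard the small
    # imaginary component that numerical error can introduce.
    covmean = sqrtm(sigma_r @ sigma_g)
    if np.iscomplexobj(covmean):
        covmean = covmean.real
    return diff @ diff + np.trace(sigma_r + sigma_g - 2.0 * covmean)
```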


Related Research

03/28/2022
Conjugate Gradient Method for Generative Adversarial Networks
While the generative model has many advantages, it is not feasible to ca...

01/28/2022
Using Constant Learning Rate of Two Time-Scale Update Rule for Training Generative Adversarial Networks
Previous numerical results have shown that a two time-scale update rule ...

10/09/2022
Dissecting adaptive methods in GANs
Adaptive methods are a crucial component widely used for training genera...

10/13/2019
Implicit competitive regularization in GANs
Generative adversarial networks (GANs) are capable of producing high qua...

01/31/2023
Mind the (optimality) Gap: A Gap-Aware Learning Rate Scheduler for Adversarial Nets
Adversarial nets have proved to be powerful in various domains including...

08/28/2020
Adaptive WGAN with loss change rate balancing
Optimizing the discriminator in Generative Adversarial Networks (GANs) t...

12/01/2021
Convergence of GANs Training: A Game and Stochastic Control Methodology
Training of generative adversarial networks (GANs) is known for its diff...

Code Repositories

Deep-learning-with-cats

Deep learning with cats (^._.^)

