WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

02/15/2021

by Albert No, et al.

Generative adversarial networks (GANs) are a widely used class of deep generative models, but their minimax training dynamics remain poorly understood. In this work, we show that GANs with a 2-layer infinite-width generator and a 2-layer finite-width discriminator, trained with stochastic gradient ascent-descent, have no spurious stationary points. We then show that when the width of the generator is finite but wide, there are no spurious stationary points within a ball whose radius becomes arbitrarily large (covering the entire parameter space) as the width goes to infinity.
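The training scheme analyzed above — simultaneous stochastic gradient ascent on the discriminator (critic) and descent on the generator for the WGAN objective — can be illustrated with a toy sketch. Everything below (the 1-D Gaussian data, the widths, the learning rate, the weight clipping for Lipschitz control, and the finite-difference gradients standing in for autodiff) is an illustrative assumption, not the paper's construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: real data x ~ N(mu, 1).
# 2-layer generator g(z) = a . relu(W z + b),
# 2-layer critic    f(x) = v . relu(U x + c).
mu = 2.0
gen_width, crit_width = 16, 8

def init(width):
    return {"W": rng.normal(size=width) * 0.5,
            "b": np.zeros(width),
            "a": rng.normal(size=width) * 0.5}

G = init(gen_width)   # generator parameters
D = init(crit_width)  # critic parameters

def net(p, x):
    # 2-layer net a . relu(W x + b), applied elementwise over a 1-D batch
    h = np.maximum(p["W"][None, :] * x[:, None] + p["b"][None, :], 0.0)
    return h @ p["a"]

def wgan_value(z, x_real):
    # WGAN minimax objective: E[f(x_real)] - E[f(g(z))]
    return net(D, x_real).mean() - net(D, net(G, z)).mean()

def grad(params, fn, eps=1e-4):
    # central finite differences; a real implementation would use autodiff
    out = {}
    for k, v in params.items():
        gk = np.zeros_like(v)
        for i in range(v.size):
            v[i] += eps; up = fn()
            v[i] -= 2 * eps; dn = fn()
            v[i] += eps
            gk[i] = (up - dn) / (2 * eps)
        out[k] = gk
    return out

lr = 0.05
for step in range(300):
    z = rng.normal(size=64)
    x_real = mu + rng.normal(size=64)
    val = lambda: wgan_value(z, x_real)
    gD, gG = grad(D, val), grad(G, val)
    for k in D: D[k] += lr * gD[k]   # ascent step on the critic
    for k in G: G[k] -= lr * gG[k]   # descent step on the generator
    # crude Lipschitz control via weight clipping (original WGAN recipe)
    for k in D: D[k] = np.clip(D[k], -1.0, 1.0)

fake_mean = float(net(G, rng.normal(size=1000)).mean())
print(round(fake_mean, 2))
```

The paper's result concerns the stationary points of exactly this kind of ascent-descent dynamics: as the generator width grows, the loss landscape the sketch above navigates provably contains no spurious stationary points.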


