
WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

by   Albert No, et al.

Generative adversarial networks (GANs) are a widely used class of deep generative models, but their minimax training dynamics are not yet well understood. In this work, we show that GANs with a 2-layer infinite-width generator and a 2-layer finite-width discriminator trained with stochastic gradient ascent-descent have no spurious stationary points. We then show that when the width of the generator is finite but wide, there are no spurious stationary points within a ball whose radius becomes arbitrarily large (to cover the entire parameter space) as the width goes to infinity.
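The training scheme the abstract refers to can be illustrated concretely. Below is a minimal numpy sketch of stochastic gradient ascent-descent for a WGAN with a 2-layer generator and a 2-layer critic, as in the paper's setting. All dimensions, the learning rate, the toy data distribution, and names such as `sgda_step` are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
d_z, d_x, m_g, m_d = 4, 2, 256, 16  # latent dim, data dim, generator/critic widths (assumed)

# 2-layer generator G(z) = A relu(W z); the paper studies the wide-m_g regime.
W = rng.normal(size=(m_g, d_z)) / np.sqrt(d_z)
A = rng.normal(size=(d_x, m_g)) / np.sqrt(m_g)
# 2-layer finite-width critic D(x) = v . relu(U x)
U = rng.normal(size=(m_d, d_x)) / np.sqrt(d_x)
v = rng.normal(size=m_d) / np.sqrt(m_d)

relu = lambda t: np.maximum(t, 0.0)
G = lambda z: relu(z @ W.T) @ A.T   # (b, d_z) -> (b, d_x)
D = lambda x: relu(x @ U.T) @ v     # (b, d_x) -> (b,)

def sgda_step(x_real, z, lr=1e-2):
    """One ascent step on the critic, then one descent step on the generator,
    for the WGAN objective  min_G max_D  E[D(x)] - E[D(G(z))]."""
    global W, A, U, v
    b = len(z)
    x_fake = G(z)

    # --- critic ascent on E[D(x_real)] - E[D(x_fake)] ---
    grad_v = relu(x_real @ U.T).mean(axis=0) - relu(x_fake @ U.T).mean(axis=0)
    mask_r = (x_real @ U.T > 0).astype(float)   # relu derivative, (b, m_d)
    mask_f = (x_fake @ U.T > 0).astype(float)
    grad_U = ((mask_r * v).T @ x_real - (mask_f * v).T @ x_fake) / b
    v = v + lr * grad_v
    U = U + lr * grad_U

    # --- generator descent on -E[D(G(z))], via manual backprop ---
    h = relu(z @ W.T)                            # hidden activations, (b, m_g)
    g_x = -((x_fake @ U.T > 0) * v) @ U / b      # dL/dx_fake, (b, d_x)
    grad_A = g_x.T @ h
    g_h = (g_x @ A) * (z @ W.T > 0)              # backprop through relu
    grad_W = g_h.T @ z
    A = A - lr * grad_A
    W = W - lr * grad_W

# Toy run: "real" data drawn from a shifted Gaussian (purely illustrative).
for _ in range(50):
    sgda_step(x_real=rng.normal(loc=1.0, size=(32, d_x)),
              z=rng.normal(size=(32, d_z)))
```

The paper's result concerns the stationary points of exactly this kind of ascent-descent dynamics when m_g is taken to infinity (or merely large, for the second result).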




SGD Learns One-Layer Networks in WGANs

Generative adversarial networks (GANs) are a widely used framework for l...

Instability and Local Minima in GAN Training with Kernel Discriminators

Generative Adversarial Networks (GANs) are a widely-used tool for genera...

Commutator width in the first Grigorchuk group

Let G be the first Grigorchuk group. We show that the commutator width o...

Unifying GANs and Score-Based Diffusion as Generative Particle Models

Particle-based deep generative models, such as gradient flows and score-...

On distinguishability criteria for estimating generative models

Two recently introduced criteria for estimation of generative models are...

Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models

We propose a new technique that boosts the convergence of training gener...

Collective evolution of weights in wide neural networks

We derive a nonlinear integro-differential transport equation describing...

Code Repositories


This is public repository for 'WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points'
