
WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points

02/15/2021
by   Albert No, et al.

Generative adversarial networks (GANs) are a widely used class of deep generative models, but their minimax training dynamics remain poorly understood. In this work, we show that GANs with a 2-layer infinite-width generator and a 2-layer finite-width discriminator, trained with stochastic gradient descent-ascent, have no spurious stationary points. We then show that when the generator's width is finite but large, there are no spurious stationary points within a ball whose radius grows arbitrarily large (eventually covering the entire parameter space) as the width goes to infinity.
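As a concrete illustration of the setup the abstract describes, here is a minimal NumPy sketch of a 2-layer generator and a 2-layer critic updated with alternating stochastic gradient steps (ascent for the critic, descent for the generator) on the WGAN objective. All widths, step sizes, and the 1-D toy target are illustrative assumptions, not the paper's; the weight clipping is a crude stand-in for the critic's Lipschitz constraint and is not part of the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions): a wide generator hidden layer
# and a narrower critic, matching the 2-layer/2-layer setup.
width_g, width_d, dim_z, dim_x = 64, 16, 2, 1

# 2-layer generator: G(z) = A @ relu(B @ z)
B = rng.normal(size=(width_g, dim_z)) / np.sqrt(dim_z)
A = rng.normal(size=(dim_x, width_g)) / np.sqrt(width_g)

# 2-layer critic (discriminator): D(x) = v @ relu(W @ x)
W = rng.normal(size=(width_d, dim_x)) * 0.1
v = rng.normal(size=(width_d,)) * 0.1

relu = lambda t: np.maximum(t, 0.0)
lr, batch, clip = 1e-2, 128, 0.1

for step in range(500):
    x_real = rng.normal(loc=2.0, scale=0.5, size=(batch, dim_x))  # toy target
    z = rng.normal(size=(batch, dim_z))
    u = z @ B.T              # generator pre-activations, (batch, width_g)
    h = relu(u)
    x_fake = h @ A.T         # generated samples, (batch, dim_x)

    # --- critic ascent step on E[D(x_real)] - E[D(x_fake)] ---
    mask_r = (x_real @ W.T > 0)          # relu derivative masks
    mask_f = (x_fake @ W.T > 0)
    grad_v = (relu(x_real @ W.T) - relu(x_fake @ W.T)).mean(0)
    grad_W = np.einsum('bk,bi->ki', mask_r * v, x_real) / batch \
           - np.einsum('bk,bi->ki', mask_f * v, x_fake) / batch
    v += lr * grad_v
    W += lr * grad_W
    # Weight clipping: a crude stand-in for the Lipschitz constraint.
    np.clip(W, -clip, clip, out=W)
    np.clip(v, -clip, clip, out=v)

    # --- generator descent step on -E[D(G(z))], using the updated critic ---
    mask_f = (x_fake @ W.T > 0)
    g_x = -((mask_f * v) @ W)            # d(loss)/d(x_fake), (batch, dim_x)
    grad_A = np.einsum('bi,bj->ij', g_x, h) / batch
    g_u = (g_x @ A) * (u > 0)            # backprop through generator relu
    grad_B = np.einsum('bk,bj->kj', g_u, z) / batch
    A -= lr * grad_A
    B -= lr * grad_B

# Mean of a fresh batch of generated samples after training.
fake_mean = float((relu(rng.normal(size=(1000, dim_z)) @ B.T) @ A.T).mean())
```

Alternating one ascent step with one descent step as above is the plain stochastic gradient descent-ascent scheme the abstract refers to; practical WGAN implementations usually take several critic steps per generator step and use a gradient penalty rather than weight clipping.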


Related Research

- 10/15/2019, SGD Learns One-Layer Networks in WGANs: Generative adversarial networks (GANs) are a widely used framework for l...
- 08/21/2022, Instability and Local Minima in GAN Training with Kernel Discriminators: Generative Adversarial Networks (GANs) are a widely-used tool for genera...
- 10/16/2017, Commutator width in the first Grigorchuk group: Let G be the first Grigorchuk group. We show that the commutator width o...
- 05/25/2023, Unifying GANs and Score-Based Diffusion as Generative Particle Models: Particle-based deep generative models, such as gradient flows and score-...
- 12/19/2014, On distinguishability criteria for estimating generative models: Two recently introduced criteria for estimation of generative models are...
- 01/07/2018, Gradient Layer: Enhancing the Convergence of Adversarial Training for Generative Models: We propose a new technique that boosts the convergence of training gener...
- 10/09/2018, Collective evolution of weights in wide neural networks: We derive a nonlinear integro-differential transport equation describing...

Code Repositories

Infinite-WGAN

Public repository for "WGAN with an Infinitely Wide Generator Has No Spurious Stationary Points".
