Sample weighting as an explanation for mode collapse in generative adversarial networks

10/05/2020
by Aksel Wilhelm Wold Eide, et al.

Generative adversarial networks were introduced with a logistic minimax cost formulation (MM-GAN), which normally fails to train due to saturation, and a non-saturating reformulation (NS-GAN). While addressing the saturation problem, NS-GAN also inverts the generator's sample weighting, implicitly shifting emphasis from higher-scoring to lower-scoring samples when updating parameters. We present both theory and empirical results suggesting that this makes NS-GAN prone to mode dropping. We design MM-nsat, which preserves MM-GAN's sample weighting while avoiding saturation by rescaling the MM-GAN minibatch gradient so that its magnitude approximates the magnitude of the NS-GAN gradient. MM-nsat has qualitatively different training dynamics, and on MNIST and CIFAR-10 it is stronger in terms of mode coverage, stability, and FID. While the empirical results for MM-nsat are promising, and favorable also in comparison with the LS-GAN and Hinge-GAN formulations, our main contribution is to show how and why NS-GAN's sample weighting causes mode dropping and training collapse.
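To make the sample-weighting claim concrete: with D(x) = sigmoid(l(x)), the per-sample generator gradient is d/dl log(1 - sigmoid(l)) = -D under MM-GAN but d/dl (-log sigmoid(l)) = -(1 - D) under NS-GAN, so MM-GAN weights a generated sample by its discriminator score D, while NS-GAN weights it by 1 - D. The following is a minimal PyTorch sketch of the MM-nsat idea as described in the abstract: keep the MM-GAN minibatch gradient direction but rescale its norm to match the NS-GAN gradient norm. The function name and the exact rescaling rule are illustrative assumptions, not the paper's reference implementation.

import torch
import torch.nn.functional as F

def mm_nsat_generator_step(generator, discriminator, z, opt):
    # One generator update following the MM-nsat idea from the abstract:
    # MM-GAN gradient direction, rescaled to the NS-GAN gradient magnitude.
    # Illustrative sketch; the paper's exact rescaling may differ in detail.
    logits = discriminator(generator(z))      # raw scores; D = sigmoid(logits)

    # MM-GAN generator loss E[log(1 - D(G(z)))], written stably via logsigmoid.
    loss_mm = F.logsigmoid(-logits).mean()
    # NS-GAN generator loss E[-log D(G(z))].
    loss_ns = -F.logsigmoid(logits).mean()

    params = [p for p in generator.parameters() if p.requires_grad]
    g_mm = torch.autograd.grad(loss_mm, params, retain_graph=True)
    g_ns = torch.autograd.grad(loss_ns, params)

    norm_mm = torch.sqrt(sum((g ** 2).sum() for g in g_mm))
    norm_ns = torch.sqrt(sum((g ** 2).sum() for g in g_ns))
    scale = norm_ns / (norm_mm + 1e-12)       # guard against a vanished MM gradient

    opt.zero_grad()
    for p, g in zip(params, g_mm):
        p.grad = g * scale                    # keep MM direction, NS-like magnitude
    opt.step()

Alternated with ordinary discriminator updates, a step like this matches NS-GAN's gradient magnitude near the saturating regime while preserving MM-GAN's emphasis on higher-scoring samples.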


Related research

07/11/2018 - On catastrophic forgetting and mode collapse in Generative Adversarial Networks
Generative Adversarial Networks (GAN) are one of the most prominent tool...

10/10/2019 - Comparison of Generative Adversarial Networks Architectures Which Reduce Mode Collapse
Generative Adversarial Networks are known for their high quality outputs...

03/23/2018 - Generative Adversarial Autoencoder Networks
We introduce an effective model to overcome the problem of mode collapse...

12/12/2019 - Coevolution of Generative Adversarial Networks
Generative adversarial networks (GAN) became a hot topic, presenting imp...

01/19/2018 - Composite Functional Gradient Learning of Generative Adversarial Models
Generative adversarial networks (GAN) have become popular for generating...

07/25/2021 - Evolutionary Generative Adversarial Networks based on New Fitness Function and Generic Crossover Operator
Evolutionary generative adversarial networks (E-GAN) attempts to allevia...

02/14/2020 - Top-K Training of GANs: Improving Generators by Making Critics Less Critical
We introduce a simple (one line of code) modification to the Generative ...
