Alleviation for Gradient Exploding in GANs: Fake Can Be Real

12/28/2019
by   Song Tao, et al.
In order to alleviate the notorious mode collapse phenomenon in generative adversarial networks (GANs), we propose a novel training method for GANs in which certain fake samples are treated as real ones during training. This strategy reduces the gradient that the generator receives in regions where gradient exploding occurs. We show how gradient exploding leads to unbalanced generation and a vicious-circle issue in practical training. We also prove theoretically that gradient exploding can be alleviated with a difference penalty on the discriminator and by applying the fake-as-real treatment to fake samples that are very close to real ones. Accordingly, we propose Fake-as-Real GAN (FARGAN), which trains more stably and generates a more faithful distribution. Experiments on different datasets verify our theoretical analysis.

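The core idea, relabeling a subset of fake samples as real in the discriminator loss, can be sketched as follows. This is a minimal PyTorch illustration, not the paper's exact FARGAN procedure: the selection rule (relabeling the fake samples the discriminator scores as most realistic) and the relabeled fraction `fake_as_real_frac` are assumptions made here for clarity.

```python
import torch
import torch.nn.functional as F

def discriminator_loss(disc, real, fake, fake_as_real_frac=0.1):
    # Standard GAN discriminator loss, except that a small fraction of the
    # fake samples is relabeled as real ("fake-as-real").
    real_logits = disc(real)            # shape (batch, 1)
    fake_logits = disc(fake.detach())   # generator is not updated here

    n_as_real = int(fake_as_real_frac * fake.size(0))

    # Assumed selection rule: relabel the fake samples the discriminator
    # already scores as most realistic (largest logits).
    order = torch.argsort(fake_logits.view(-1), descending=True)
    as_real, as_fake = order[:n_as_real], order[n_as_real:]

    loss = F.binary_cross_entropy_with_logits(
        real_logits, torch.ones_like(real_logits))
    loss += F.binary_cross_entropy_with_logits(
        fake_logits[as_fake], torch.zeros_like(fake_logits[as_fake]))
    if n_as_real > 0:
        # Fake-as-real term: these fake samples get the "real" label,
        # which dampens the gradient the generator would otherwise receive
        # in regions where gradient exploding happens.
        loss += F.binary_cross_entropy_with_logits(
            fake_logits[as_real], torch.ones_like(fake_logits[as_real]))
    return loss
```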