Improving sample diversity of a pre-trained, class-conditional GAN by changing its class embeddings

10/10/2019
by Qi Li, et al.

Mode collapse is a well-known issue with Generative Adversarial Networks (GANs) and is a byproduct of unstable GAN training. We propose to improve the sample diversity of a pre-trained class-conditional generator by modifying its class embeddings in the direction that maximizes the log-probability outputs of a classifier pre-trained on the same dataset. We improve the sample diversity of state-of-the-art ImageNet BigGANs at both 128x128 and 256x256 resolutions. By replacing the embeddings, we can also synthesize plausible images for Places365 using a BigGAN pre-trained on ImageNet.
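The procedure described above can be sketched as gradient ascent on the class embedding alone: generate samples conditioned on the embedding, score them with a frozen classifier, and update the embedding to raise the classifier's log-probability for the target class. The PyTorch sketch below is a minimal, hypothetical illustration under that reading of the abstract; the tiny stand-in generator and classifier, and all names such as `TinyConditionalGenerator` and `optimize_class_embedding`, are assumptions for demonstration and are not the paper's code or the BigGAN API.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Stand-ins for the pre-trained networks. In the paper's setting these would be
# a pre-trained class-conditional generator (e.g. BigGAN) and a classifier
# pre-trained on the same dataset; both stay frozen.
class TinyConditionalGenerator(nn.Module):
    def __init__(self, z_dim=16, embed_dim=8, img_pixels=64):
        super().__init__()
        self.net = nn.Linear(z_dim + embed_dim, img_pixels)

    def forward(self, z, class_embedding):
        # Condition on the class embedding by concatenating it with the noise.
        cond = class_embedding.expand(z.size(0), -1)
        return torch.tanh(self.net(torch.cat([z, cond], dim=1)))


class TinyClassifier(nn.Module):
    def __init__(self, img_pixels=64, num_classes=10):
        super().__init__()
        self.net = nn.Linear(img_pixels, num_classes)

    def forward(self, x):
        return self.net(x)  # class logits


def optimize_class_embedding(generator, classifier, embedding, target_class,
                             steps=200, batch_size=32, z_dim=16, lr=0.01):
    """Gradient ascent on the class embedding only: maximize the classifier's
    mean log-probability of the target class over generated samples."""
    embedding = embedding.clone().requires_grad_(True)
    opt = torch.optim.Adam([embedding], lr=lr)
    for _ in range(steps):
        z = torch.randn(batch_size, z_dim)
        images = generator(z, embedding)
        log_probs = F.log_softmax(classifier(images), dim=1)
        loss = -log_probs[:, target_class].mean()  # ascend log p(y=target | x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return embedding.detach()


if __name__ == "__main__":
    torch.manual_seed(0)
    G, C = TinyConditionalGenerator(), TinyClassifier()
    init_embedding = torch.randn(1, 8)  # e.g. the generator's original class embedding
    new_embedding = optimize_class_embedding(G, C, init_embedding, target_class=3)
```

Only the embedding is passed to the optimizer, so the generator and classifier weights are untouched; the same loop would also cover the cross-dataset case hinted at in the abstract, where an embedding is optimized against a classifier trained on a different dataset (e.g. Places365).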
