Improving the Speed and Quality of GAN by Adversarial Training

08/07/2020
by Jiachen Zhong, et al.

Generative adversarial networks (GANs) have shown remarkable results in image generation tasks. High-fidelity class-conditional GAN methods often rely on stabilization techniques that constrain global Lipschitz continuity; such regularization yields less expressive models and slower convergence. Other techniques, such as large-batch training, demand unconventional computing resources and are not widely accessible. In this paper, we develop an efficient algorithm, namely FastGAN (Free AdverSarial Training), which improves the speed and quality of GAN training based on the adversarial training technique. We benchmark our method on CIFAR10, a subset of ImageNet, and the full ImageNet dataset. We choose strong baselines such as SNGAN and SAGAN; the results demonstrate that our training algorithm achieves better generation quality (in terms of the Inception score and Fréchet Inception distance) with less overall training time. Most notably, our training algorithm brings ImageNet training to the broader public by requiring only 2-4 GPUs.
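The abstract does not spell out the training update itself. As a rough illustration only, the sketch below shows one common way an adversarial perturbation is applied to a discriminator's inputs (an FGSM-style step in PyTorch). The function name, loss choice, and step size eps are assumptions for illustration, not the paper's FastGAN algorithm, whose "free" adversarial training scheme is described in the full text.

```python
import torch
import torch.nn.functional as F

def adversarial_perturb(discriminator, images, targets, eps=0.01):
    """Hypothetical helper: FGSM-style perturbation of discriminator inputs.

    Assumes `discriminator` returns real/fake logits and `targets` is a
    float tensor of the same shape (1 for real, 0 for fake).
    """
    images = images.clone().detach().requires_grad_(True)
    loss = F.binary_cross_entropy_with_logits(discriminator(images), targets)
    # Gradient of the discriminator loss with respect to the input images.
    (grad,) = torch.autograd.grad(loss, images)
    # Move the inputs in the direction that increases the discriminator loss,
    # then detach so the perturbed batch can be fed back into training.
    return (images + eps * grad.sign()).detach()
```

In a GAN training loop, such a perturbed batch would typically replace (or augment) the clean batch in the discriminator update; whether and how the gradient is reused for the weight update is a design choice of the specific algorithm.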

Related research

09/28/2018 · Large Scale GAN Training for High Fidelity Natural Image Synthesis
Despite recent progress in generative image modeling, successfully gener...

07/12/2019 · Virtual Adversarial Lipschitz Regularization
Generative adversarial networks (GANs) are one of the most popular appro...

04/03/2018 · Correlated discrete data generation using adversarial training
Generative Adversarial Networks (GAN) have shown great promise in tasks ...

01/27/2022 · Effective Shortcut Technique for GAN
In recent years, generative adversarial network (GAN)-based image genera...

12/06/2022 · Rethinking the Objectives of Vector-Quantized Tokenizers for Image Synthesis
Vector-Quantized (VQ-based) generative models usually consist of two bas...

08/06/2020 · GL-GAN: Adaptive Global and Local Bilevel Optimization model of Image Generation
Although Generative Adversarial Networks have shown remarkable performan...

12/02/2019 · LOGAN: Latent Optimisation for Generative Adversarial Networks
Training generative adversarial networks requires balancing of delicate ...
