Ultra-Data-Efficient GAN Training: Drawing A Lottery Ticket First, Then Training It Toughly

02/28/2021
by Tianlong Chen, et al.

Training generative adversarial networks (GANs) with limited data generally results in deteriorated performance and collapsed models. To overcome this challenge, we draw on the recent observation of Kalibhat et al. (2020) and Chen et al. (2021d) that one can discover independently trainable and highly sparse subnetworks (a.k.a. lottery tickets) within GANs. Treating this as an inductive prior, we decompose the data-hungry GAN training into two sequential sub-problems: (i) identifying the lottery ticket from the original GAN; then (ii) training the found sparse subnetwork with aggressive data and feature augmentations. Both sub-problems re-use the same small training set of real images. Such a coordinated framework lets us focus on lower-complexity, more data-efficient sub-problems, effectively stabilizing training and improving convergence. Comprehensive experiments endorse the effectiveness of our proposed ultra-data-efficient training framework across various GAN architectures (SNGAN, BigGAN, and StyleGAN2) and diverse datasets (CIFAR-10, CIFAR-100, Tiny-ImageNet, and ImageNet). Moreover, our training framework also exhibits strong few-shot generalization, i.e., it generates high-fidelity images when trained from scratch on just 100 real images, without any pre-training. Code is available at: https://github.com/VITA-Group/Ultra-Data-Efficient-GAN-Training.
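
To make the two-stage recipe concrete, below is a minimal PyTorch sketch of the pipeline. Everything in it is an illustrative assumption rather than the paper's exact setup: the toy Generator/Discriminator pair, the 90% one-shot global magnitude-pruning sparsity, the hinge loss, and the translation-based augment() standing in for the paper's aggressive DiffAugment-style augmentations (see the repository above for the authors' implementation).

```python
# Minimal sketch of the two-stage pipeline, assuming PyTorch. The toy
# Generator/Discriminator, 90% sparsity, hinge loss, and roll-based augment()
# are illustrative placeholders, NOT the paper's exact configuration.
import copy
from itertools import cycle

import torch
import torch.nn as nn


class Generator(nn.Module):
    """Toy stand-in for the SNGAN/BigGAN/StyleGAN2 generators in the paper."""
    def __init__(self, z_dim=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(z_dim, 256), nn.ReLU(),
                                 nn.Linear(256, 3 * 32 * 32), nn.Tanh())

    def forward(self, z):
        return self.net(z).view(-1, 3, 32, 32)


class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 256),
                                 nn.LeakyReLU(0.2), nn.Linear(256, 1))

    def forward(self, x):
        return self.net(x)


def find_ticket(gen, brief_train_fn, sparsity=0.9):
    """Stage (i): one-shot global magnitude pruning, then rewind to init."""
    init_state = copy.deepcopy(gen.state_dict())      # remember theta_0
    brief_train_fn(gen)                               # short dense warm-up pass
    scores = torch.cat([p.detach().abs().flatten() for p in gen.parameters()])
    threshold = torch.quantile(scores, sparsity)      # global magnitude cutoff
    masks = [(p.detach().abs() > threshold).float()   # 1 = keep, 0 = prune
             for p in gen.parameters()]               # (biases pruned too, for simplicity)
    gen.load_state_dict(init_state)                   # rewind surviving weights to init
    return masks


def augment(x):
    """Stand-in for aggressive augmentation: random translation via torch.roll."""
    dx, dy = torch.randint(-2, 3, (2,)).tolist()
    return torch.roll(x, shifts=(dx, dy), dims=(2, 3))


def train_ticket(gen, disc, masks, real_loader, steps=1000, z_dim=128):
    """Stage (ii): hinge-loss GAN training of the sparse subnetwork only,
    augmenting both real and fake images before the discriminator sees them."""
    g_opt = torch.optim.Adam(gen.parameters(), lr=2e-4, betas=(0.0, 0.9))
    d_opt = torch.optim.Adam(disc.parameters(), lr=2e-4, betas=(0.0, 0.9))
    with torch.no_grad():                             # impose the ticket's mask
        for p, m in zip(gen.parameters(), masks):
            p.mul_(m)
    for _, real in zip(range(steps), cycle(real_loader)):
        z = torch.randn(real.size(0), z_dim)
        fake = gen(z)
        d_loss = (torch.relu(1.0 - disc(augment(real))).mean()
                  + torch.relu(1.0 + disc(augment(fake.detach()))).mean())
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()
        g_loss = -disc(augment(gen(z))).mean()
        g_opt.zero_grad()
        g_loss.backward()
        with torch.no_grad():                         # freeze pruned connections
            for p, m in zip(gen.parameters(), masks):
                p.grad.mul_(m)
        g_opt.step()
        with torch.no_grad():
            for p, m in zip(gen.parameters(), masks):
                p.mul_(m)                             # re-zero after the update


if __name__ == "__main__":
    gen, disc = Generator(), Discriminator()
    loader = [torch.randn(8, 3, 32, 32)] * 4          # placeholder "real" batches
    masks = find_ticket(gen, brief_train_fn=lambda g: None)  # warm-up elided here
    train_ticket(gen, disc, masks, loader, steps=10)
```

The two design points the sketch tries to mirror are (a) rewinding the surviving weights to their initialization after pruning, as in lottery-ticket training, and (b) zeroing both the gradients and the weights of pruned connections on every step so the subnetwork stays sparse throughout training.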

Related research:

06/18/2020 · Differentiable Augmentation for Data-Efficient GAN Training
The performance of generative adversarial networks (GANs) heavily deteri...

04/13/2017 · On the Effects of Batch and Weight Normalization in Generative Adversarial Networks
Generative adversarial networks (GANs) are highly effective unsupervised...

12/13/2021 · DGL-GAN: Discriminator Guided Learning for GAN Compression
Generative Adversarial Networks (GANs) with high computation costs, e.g....

01/12/2021 · Towards Faster and Stabilized GAN Training for High-fidelity Few-shot Image Synthesis
Training Generative Adversarial Networks (GAN) on high-fidelity images u...

03/05/2022 · Don't Be So Dense: Sparse-to-Sparse GAN Training Without Sacrificing Performance
Generative adversarial networks (GANs) have received an upsurging intere...

06/12/2018 · The Unusual Effectiveness of Averaging in GAN Training
We show empirically that the optimal strategy of parameter averaging in ...

10/29/2019 · Small-GAN: Speeding Up GAN Training Using Core-sets
Recent work by Brock et al. (2018) suggests that Generative Adversarial ...
