Don't Be So Dense: Sparse-to-Sparse GAN Training Without Sacrificing Performance

03/05/2022
by   Shiwei Liu, et al.

Generative adversarial networks (GANs) have received a surge of interest since being proposed, owing to the high quality of the data they generate. While achieving increasingly impressive results, the resource demands associated with their large model size hinder the use of GANs in resource-limited scenarios. For inference, existing model compression techniques can reduce model complexity with comparable performance. However, the training efficiency of GANs has been less explored due to their fragile training process. In this paper we explore, for the first time, the possibility of directly training sparse GANs from scratch without any dense or pre-training steps. Even more unconventionally, our proposed method enables directly training sparse unbalanced GANs with an extremely sparse generator from scratch. Instead of training full GANs, we start with sparse GANs and dynamically explore the parameter space spanned by the generator throughout training. Such a sparse-to-sparse training procedure progressively enhances the capacity of the highly sparse generator while sticking to a fixed small parameter budget, with appealing training and inference efficiency gains. Extensive experiments with modern GAN architectures validate the effectiveness of our method. Our sparsified GANs, trained from scratch in a single run, outperform those learned by expensive iterative pruning and re-training. Perhaps most importantly, we find that directly training sparse GANs from scratch can be a much more efficient solution than inheriting parameters from expensive pre-trained GANs. For example, training with only an 80% sparse generator and a 70% sparse discriminator, our method achieves even better performance than the dense BigGAN.
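The sparse-to-sparse procedure the abstract describes, namely starting from a sparse network and dynamically exploring the parameter space under a fixed budget, is commonly realized as a prune-and-grow cycle on a binary connectivity mask. Below is a minimal NumPy sketch of one such update in the RigL style (drop the weakest active weights, regrow at the positions with the largest gradient magnitude). The function names (`init_mask`, `prune_and_grow`), the `drop_frac` hyperparameter, and the gradient-based growth criterion are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def init_mask(shape, sparsity, rng):
    """Random binary mask with a `sparsity` fraction of connections zeroed."""
    mask = np.ones(int(np.prod(shape)), dtype=bool)
    n_zero = int(sparsity * mask.size)
    mask[rng.choice(mask.size, size=n_zero, replace=False)] = False
    return mask.reshape(shape)

def prune_and_grow(weights, mask, grads, drop_frac):
    """One sparse-to-sparse update (illustrative, RigL-style):
    drop the smallest-magnitude active weights, then regrow the same
    number of connections at the inactive positions with the largest
    gradient magnitude. The active-parameter budget stays fixed."""
    flat_w = weights.ravel()
    flat_m = mask.ravel().copy()
    flat_g = np.abs(grads).ravel()
    active = np.flatnonzero(flat_m)
    n_drop = int(drop_frac * active.size)
    if n_drop == 0:
        return mask.copy()
    # Drop: deactivate the n_drop weakest active connections.
    drop_idx = active[np.argsort(np.abs(flat_w[active]))[:n_drop]]
    flat_m[drop_idx] = False
    # Grow: activate the n_drop inactive positions with the largest gradients.
    inactive = np.flatnonzero(~flat_m)
    grow_idx = inactive[np.argsort(flat_g[inactive])[-n_drop:]]
    flat_m[grow_idx] = True
    return flat_m.reshape(mask.shape)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 8))                        # toy generator weight matrix
mask = init_mask(W.shape, sparsity=0.8, rng=rng)   # 80% sparse, as in the abstract's example
grads = rng.normal(size=W.shape)                   # stand-in for backprop gradients
new_mask = prune_and_grow(W * mask, mask, grads, drop_frac=0.3)
print(mask.sum(), new_mask.sum())                  # the active-weight budget is unchanged
```

Repeating this update every few hundred steps lets connectivity migrate to the most promising positions while the number of trained parameters, and hence the training cost, never exceeds the initial sparse budget.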

Related research

02/28/2023 · Double Dynamic Sparse Training for GANs
The past decade has witnessed a drastic increase in modern deep neural n...

05/30/2022 · Superposing Many Tickets into One: A Performance Booster for Sparse Neural Network Training
Recent works on sparse neural network training (sparse training) have sh...

01/04/2021 · Guiding GANs: How to control non-conditional pre-trained GANs for conditional image generation
Generative Adversarial Networks (GANs) are an arrange of two neural netw...

02/28/2021 · Ultra-Data-Efficient GAN Training: Drawing A Lottery Ticket First, Then Training It Toughly
Training generative adversarial networks (GANs) with limited data genera...

05/22/2020 · Host-Pathogen Co-evolution Inspired Algorithm Enables Robust GAN Training
Generative adversarial networks (GANs) are pairs of artificial neural ne...

08/20/2019 · Sparse Generative Adversarial Network
We propose a new approach to Generative Adversarial Networks (GANs) to a...

09/07/2022 · Supervised GAN Watermarking for Intellectual Property Protection
We propose a watermarking method for protecting the Intellectual Propert...
