GAN Slimming: All-in-One GAN Compression by A Unified Optimization Framework

08/25/2020
by Haotao Wang, et al.

Generative adversarial networks (GANs) have gained increasing popularity in various computer vision applications and have recently started to be deployed to resource-constrained mobile devices. Like other deep models, state-of-the-art GANs suffer from high parameter complexity, which has motivated the exploration of compressing GANs (usually their generators). Compared to the vast literature and prevailing success in compressing deep classifiers, the study of GAN compression remains in its infancy, so far leveraging individual compression techniques rather than more sophisticated combinations. We observe that, due to the notorious instability of GAN training, heuristically stacking different compression techniques leads to unsatisfactory results. To this end, we propose the first unified optimization framework combining multiple compression means for GAN compression, dubbed GAN Slimming (GS). GS seamlessly integrates three mainstream compression techniques: model distillation, channel pruning and quantization, together with the GAN minimax objective, into one unified optimization form that can be efficiently optimized end to end. Without bells and whistles, GS largely outperforms existing options in compressing image-to-image translation GANs. Specifically, we apply GS to compress CartoonGAN, a state-of-the-art style transfer network, by up to 47 times, with minimal visual quality degradation. Code and pre-trained models can be found at https://github.com/TAMU-VITA/GAN-Slimming.
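For intuition, below is a minimal PyTorch-style sketch of what such a unified objective can look like: a GAN loss for the compressed (student) generator, a distillation term toward the dense teacher, an L1 channel-sparsity term on BatchNorm scale factors for pruning, and weight quantization via a straight-through estimator. All names and weights here (quantize_ste, bn_sparsity, generator_step, rho, lam, 8-bit quantization) are illustrative assumptions for exposition, not the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of a unified GAN-compression objective:
# GAN loss + distillation from a dense teacher + L1 sparsity on BatchNorm scales
# (channel pruning) + weight quantization with a straight-through estimator.
import torch
import torch.nn as nn
import torch.nn.functional as F


def quantize_ste(w: torch.Tensor, num_bits: int = 8) -> torch.Tensor:
    """Uniform weight quantization; gradients pass straight through."""
    scale = w.abs().max() / (2 ** (num_bits - 1) - 1) + 1e-12
    w_q = torch.round(w / scale) * scale
    return w + (w_q - w).detach()  # forward: quantized values, backward: identity


def bn_sparsity(model: nn.Module) -> torch.Tensor:
    """L1 norm of BatchNorm scale factors; drives channels toward zero
    so they can be pruned after training."""
    return sum(m.weight.abs().sum()
               for m in model.modules() if isinstance(m, nn.BatchNorm2d))


def generator_step(student, teacher, disc, x, rho=0.01, lam=10.0):
    """One update of the student generator under the combined objective.
    In practice, the student's conv weights would be passed through
    quantize_ste inside its forward pass."""
    y_s = student(x)                      # output of the compressed generator
    with torch.no_grad():
        y_t = teacher(x)                  # dense teacher output (distillation target)
    loss_gan = -disc(y_s).mean()          # generator's adversarial term
    loss_dist = F.l1_loss(y_s, y_t)       # stay close to the teacher
    loss_sparse = bn_sparsity(student)    # channel-pruning regularizer
    return loss_gan + lam * loss_dist + rho * loss_sparse
```

Optimizing all terms jointly, rather than applying pruning, quantization and distillation one after another, is what lets the compressed generator stay stable under the GAN minimax training.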


Related research

06/15/2020 · AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks
The compression of Generative Adversarial Networks (GANs) has lately dra...

05/31/2021 · GANs Can Play Lottery Tickets Too
Deep generative adversarial networks (GANs) have gained growing populari...

12/13/2021 · DGL-GAN: Discriminator Guided Learning for GAN Compression
Generative Adversarial Networks (GANs) with high computation costs, e.g....

08/16/2021 · Online Multi-Granularity Distillation for GAN Compression
Generative Adversarial Networks (GANs) have witnessed prevailing success...

03/16/2022 · PPCD-GAN: Progressive Pruning and Class-Aware Distillation for Large-Scale Conditional GANs Compression
We push forward neural network compression research by exploiting a nove...

03/17/2020 · Blur, Noise, and Compression Robust Generative Adversarial Networks
Recently, generative adversarial networks (GANs), which learn data distr...

07/03/2020 · Self-Supervised GAN Compression
Deep learning's success has led to larger and larger models to handle mo...
