Teachers Do More Than Teach: Compressing Image-to-Image Models

03/05/2021
by Qing Jin, et al.

Generative Adversarial Networks (GANs) have achieved huge success in generating high-fidelity images; however, they suffer from low efficiency due to tremendous computational cost and bulky memory usage. Recent efforts on compressing GANs show noticeable progress in obtaining smaller generators, but at the cost of image quality or a time-consuming search process. In this work, we address these issues by introducing a teacher network that, in addition to performing knowledge distillation, provides a search space in which efficient network architectures can be found. First, we revisit the search space of generative models, introducing an inception-based residual block into generators. Second, to meet a target computation budget, we propose a one-step pruning algorithm that searches a student architecture from the teacher model and substantially reduces the searching cost; it requires neither ℓ1 sparsity regularization nor its associated hyper-parameters, simplifying the training procedure. Finally, we propose to distill knowledge by maximizing feature similarity between teacher and student via an index named Global Kernel Alignment (GKA). Our compressed networks achieve similar or even better image fidelity (FID, mIoU) than the original models at much lower computational cost, e.g., MACs. Code will be released at https://github.com/snap-research/CAT.
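The abstract does not spell out how GKA is computed. As a rough illustration, below is a minimal PyTorch sketch of a kernel-alignment-style feature-similarity loss, assuming the index resembles alignment between batch-wise linear-kernel Gram matrices of teacher and student features. The function names (`kernel_alignment`, `distill_loss`), the linear kernel, and the choice of which layers to match are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def kernel_alignment(f_t: torch.Tensor, f_s: torch.Tensor) -> torch.Tensor:
    """Cosine similarity between batch-wise Gram matrices of teacher and
    student feature maps (a kernel-alignment-style index, assumed here).

    f_t, f_s: feature tensors of shape (N, C, H, W); channel counts may differ.
    Returns a scalar in [0, 1]; 1 means perfectly aligned representations.
    """
    n = f_t.size(0)
    # Flatten each example's feature map into a vector: (N, C*H*W)
    x_t = f_t.reshape(n, -1)
    x_s = f_s.reshape(n, -1)
    # Linear-kernel Gram matrices over the batch: (N, N)
    k_t = x_t @ x_t.t()
    k_s = x_s @ x_s.t()
    # Alignment = <K_t, K_s>_F / (||K_t||_F * ||K_s||_F)
    num = (k_t * k_s).sum()
    den = k_t.norm() * k_s.norm() + 1e-8
    return num / den

def distill_loss(teacher_feats, student_feats):
    """Maximizing alignment == minimizing (1 - alignment), summed over layers."""
    return sum(1.0 - kernel_alignment(t.detach(), s)
               for t, s in zip(teacher_feats, student_feats))
```

In practice such a term would be combined with the usual adversarial and reconstruction objectives of the image-to-image generator, with teacher features detached so that gradients flow only into the student.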


research
08/16/2021

Online Multi-Granularity Distillation for GAN Compression

Generative Adversarial Networks (GANs) have witnessed prevailing success...
research
05/25/2022

Region-aware Knowledge Distillation for Efficient Image-to-Image Translation

Recent progress in image-to-image translation has witnessed the success ...
research
03/07/2020

Distilling portable Generative Adversarial Networks for Image Translation

Although Generative Adversarial Networks (GANs) have been widely used in ...
research
06/15/2020

AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks

The compression of Generative Adversarial Networks (GANs) has lately dra...
research
06/16/2020

AlphaGAN: Fully Differentiable Architecture Search for Generative Adversarial Networks

Generative Adversarial Networks (GANs) are formulated as minimax game pr...
research
12/21/2022

Exploring Content Relationships for Distilling Efficient GANs

This paper proposes a content relationship distillation (CRD) to tackle ...
