Discriminator-Cooperated Feature Map Distillation for GAN Compression

12/29/2022
by Tie Hu, et al.

Despite excellent performance in image generation, Generative Adversarial Networks (GANs) are notorious for their enormous storage requirements and intensive computation. As a powerful "performance maker", knowledge distillation has been demonstrated to be particularly efficacious in exploring low-cost GANs. In this paper, we investigate the irreplaceability of the teacher discriminator and present an inventive discriminator-cooperated distillation, abbreviated as DCD, towards refining better feature maps from the generator. In contrast to conventional pixel-to-pixel matching methods in feature map distillation, our DCD utilizes the teacher discriminator as a transformation to drive intermediate results of the student generator to be perceptually close to the corresponding outputs of the teacher generator. Furthermore, to mitigate mode collapse in GAN compression, we construct a collaborative adversarial training paradigm in which the teacher discriminator is established from scratch and co-trained with the student generator alongside our DCD. Our DCD shows superior results compared with existing GAN compression methods. For instance, after reducing the MACs of CycleGAN by over 40x and its parameters by over 80x, we decrease the FID metric from 61.53 to 48.24, while the current SoTA method only reaches 51.92. The source code of this work is available at https://github.com/poopit/DCD-official.
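To make the idea concrete, below is a minimal PyTorch-style sketch of what such a discriminator-cooperated feature-map loss could look like. All names (dcd_loss, proj, teacher_discriminator) and the exact interface between generator feature maps and the discriminator layers are illustrative assumptions, not the official implementation from the repository above.

    import torch
    import torch.nn.functional as F

    def dcd_loss(student_feat, teacher_feat, teacher_discriminator, proj=None):
        # Discriminator-cooperated feature-map distillation (sketch).
        # Rather than matching student and teacher feature maps pixel by pixel,
        # both are first transformed by (layers of) the teacher discriminator
        # and the distance is measured in that perceptual space.
        if proj is not None:
            # Hypothetical 1x1-conv adapter aligning the channel dimension of
            # the (usually thinner) student features with the teacher features.
            student_feat = proj(student_feat)

        # The teacher side provides a fixed target: no gradients flow through it.
        with torch.no_grad():
            teacher_perc = teacher_discriminator(teacher_feat)

        # The student side stays differentiable, so the student generator is
        # pushed toward perceptual agreement with the teacher generator.
        student_perc = teacher_discriminator(student_feat)

        return F.l1_loss(student_perc, teacher_perc)

In the collaborative training paradigm described above, a term of this kind would be added to the student generator's usual adversarial and task losses, while the teacher discriminator is simultaneously trained from scratch against the student's outputs; the loss weighting is likewise an assumption here rather than a value taken from the paper.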


Related research

Exploring Content Relationships for Distilling Efficient GANs (12/21/2022)
This paper proposes a content relationship distillation (CRD) to tackle ...

Revisiting Discriminator in GAN Compression: A Generator-discriminator Cooperative Compression Scheme (10/27/2021)
Recently, a series of algorithms have been explored for GAN compression, ...

DGL-GAN: Discriminator Guided Learning for GAN Compression (12/13/2021)
Generative Adversarial Networks (GANs) with high computation costs, e.g. ...

Learning Efficient GANs via Differentiable Masks and co-Attention Distillation (11/17/2020)
Generative Adversarial Networks (GANs) have been widely used in image tr...

Online Multi-Granularity Distillation for GAN Compression (08/16/2021)
Generative Adversarial Networks (GANs) have witnessed prevailing success...

PPCD-GAN: Progressive Pruning and Class-Aware Distillation for Large-Scale Conditional GANs Compression (03/16/2022)
We push forward neural network compression research by exploiting a nove...

Model Compression with Generative Adversarial Networks (12/05/2018)
More accurate machine learning models often demand more computation and ...
