Exploring Content Relationships for Distilling Efficient GANs

12/21/2022
by Lizhou You, et al.

This paper proposes content relationship distillation (CRD) to compress over-parameterized generative adversarial networks (GANs) for deployment on edge devices. In contrast to traditional instance-level distillation, we design a novel GAN-compression-oriented knowledge: we slice the contents of teacher outputs into multiple fine-grained granularities, such as row/column strips (global information) and image patches (local information), model the relationships among them, such as pairwise distances and triplet-wise angles, and encourage the student to capture these relationships within its own output contents. Built upon this content-level distillation, we also deploy an online teacher discriminator, which keeps updating when co-trained with the teacher generator and stays frozen when co-trained with the student generator, for better adversarial training. Extensive experiments on three benchmark datasets show that CRD achieves the greatest complexity reduction on GANs while obtaining the best performance among existing methods. For example, we reduce the MACs of CycleGAN by around 40x and its parameters by over 80x, while obtaining an FID of 46.61 compared with 51.92 for the current state-of-the-art. Code for this project is available at https://github.com/TheKernelZ/CRD.
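To make the idea concrete, here is a minimal NumPy sketch of the pairwise-distance portion of the distillation signal described above: teacher and student outputs are sliced into row strips, column strips, and patches, a normalized pairwise-distance matrix is built per granularity, and the student is penalized for deviating from the teacher's relation matrices. All function names, the normalization scheme, and the smooth-L1 penalty are assumptions for illustration, not the authors' exact implementation; the paper also models triplet-wise angles, which this sketch omits.

```python
import numpy as np

def slice_contents(img, num_strips=4, patch=8):
    """Slice an image (C, H, W) into three granularity groups:
    row strips, column strips (global), and square patches (local)."""
    C, H, W = img.shape
    rh, cw = H // num_strips, W // num_strips
    rows = np.stack([img[:, i*rh:(i+1)*rh, :].ravel() for i in range(num_strips)])
    cols = np.stack([img[:, :, j*cw:(j+1)*cw].ravel() for j in range(num_strips)])
    patches = np.stack([img[:, i:i+patch, j:j+patch].ravel()
                        for i in range(0, H, patch) for j in range(0, W, patch)])
    return [rows, cols, patches]

def relation_matrix(X):
    """Pairwise Euclidean distances among content pieces,
    normalized by the mean off-diagonal distance (assumed scheme)."""
    X = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-8)
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    mu = D.sum() / max(D.size - len(X), 1)  # mean over off-diagonal entries
    return D / (mu + 1e-8)

def crd_loss(teacher_out, student_out):
    """Sum of smooth-L1 gaps between teacher and student relation
    matrices, one term per granularity group."""
    loss = 0.0
    for T, S in zip(slice_contents(teacher_out), slice_contents(student_out)):
        diff = np.abs(relation_matrix(T) - relation_matrix(S))
        loss += np.where(diff < 1.0, 0.5 * diff**2, diff - 0.5).mean()
    return float(loss)
```

Because only relationships among a generator's own output pieces are matched, the loss is zero when the student reproduces the teacher's internal content structure, regardless of any global offset in pixel values.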


Related research

12/29/2022
Discriminator-Cooperated Feature Map Distillation for GAN Compression
Despite excellent performance in image generation, Generative Adversaria...

02/07/2020
Image Fine-grained Inpainting
Image inpainting techniques have shown promising improvement with the as...

08/16/2021
Online Multi-Granularity Distillation for GAN Compression
Generative Adversarial Networks (GANs) have witnessed prevailing success...

11/17/2020
Learning Efficient GANs via Differentiable Masks and co-Attention Distillation
Generative Adversarial Networks (GANs) have been widely-used in image tr...

10/27/2021
Revisiting Discriminator in GAN Compression: A Generator-discriminator Cooperative Compression Scheme
Recently, a series of algorithms have been explored for GAN compression,...

04/06/2021
Content-Aware GAN Compression
Generative adversarial networks (GANs), e.g., StyleGAN2, play a vital ro...

03/05/2021
Teachers Do More Than Teach: Compressing Image-to-Image Models
Generative Adversarial Networks (GANs) have achieved huge success in gen...
