Online Multi-Granularity Distillation for GAN Compression

08/16/2021
by Yuxi Ren, et al.

Generative Adversarial Networks (GANs) have achieved prevailing success in producing outstanding images; however, they are burdensome to deploy on resource-constrained devices due to heavy computational costs and large memory usage. Although recent efforts to compress GANs have achieved remarkable results, potential model redundancies remain, leaving room for further compression. To address this issue, we propose a novel online multi-granularity distillation (OMGD) scheme to obtain lightweight GANs that generate high-fidelity images with low computational demands. We offer the first attempt to apply single-stage online distillation to GAN-oriented compression, where the progressively promoted teacher generator helps to refine the discriminator-free student generator. Complementary teacher generators and network layers provide comprehensive, multi-granularity supervision that enhances visual fidelity from diverse dimensions. Experimental results on four benchmark datasets demonstrate that OMGD compresses Pix2Pix and CycleGAN by 40x in MACs and 82.5x in parameters without loss of image quality. This shows that OMGD provides a feasible solution for deploying real-time image translation on resource-constrained devices. Our code and models are made public at: https://github.com/bytedance/OMGD.
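The core idea of single-stage online distillation can be illustrated with a toy sketch: the teacher is trained on the real objective while, in the same training loop, the student is optimized purely to mimic the teacher's outputs, with no discriminator of its own. This is a minimal illustration, not the authors' implementation: OMGD uses deep image-to-image generators and multi-granularity (output- and feature-level) losses, whereas here both "generators" are hypothetical scalar maps y = w * x so the mechanics stay visible.

```python
import random

random.seed(0)

TARGET_W = 2.0            # ground-truth mapping the teacher must learn
w_teacher, w_student = 0.0, 0.0
lr = 0.1

data = [random.uniform(-1, 1) for _ in range(200)]

for x in data:
    # 1) Teacher step: the teacher improves on the real task each
    #    iteration ("progressively promoted" teacher).
    y_true = TARGET_W * x
    grad_t = 2 * (w_teacher * x - y_true) * x   # d/dw of (w*x - y)^2
    w_teacher -= lr * grad_t

    # 2) Student step in the SAME loop (single-stage, online): the
    #    student only matches the current teacher's output, so it
    #    needs no discriminator or ground-truth signal.
    y_teacher = w_teacher * x
    grad_s = 2 * (w_student * x - y_teacher) * x
    w_student -= lr * grad_s

# Both weights end up close to TARGET_W: the student tracks the
# teacher as the teacher itself improves.
print(w_teacher, w_student)
```

The key contrast with offline distillation is that the student never waits for a fully trained teacher; it receives gradually strengthening supervision from the first iteration onward.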


