Content-Aware GAN Compression

04/06/2021
by Yuchen Liu, et al.

Generative adversarial networks (GANs), e.g., StyleGAN2, play a vital role in various image generation and synthesis tasks, yet their notoriously high computational cost hinders their efficient deployment on edge devices. Directly applying generic compression approaches yields poor results on GANs, which motivates a number of recent GAN compression works. While prior works mainly accelerate conditional GANs, e.g., pix2pix and CycleGAN, compressing state-of-the-art unconditional GANs has rarely been explored and is more challenging. In this paper, we propose novel approaches for unconditional GAN compression. We first introduce effective channel pruning and knowledge distillation schemes specialized for unconditional GANs. We then propose a novel content-aware method to guide the processes of both pruning and distillation. With content-awareness, we can effectively prune channels that are unimportant to the contents of interest, e.g., human faces, and focus our distillation on these regions, which significantly enhances the distillation quality. On StyleGAN2 and SN-GAN, we achieve a substantial improvement over the state-of-the-art compression method. Notably, we reduce the FLOPs of StyleGAN2 by 11x with visually negligible image quality loss compared to the full-size model. More interestingly, when applied to various image manipulation tasks, our compressed model forms a smoother and better disentangled latent manifold, making it more effective for image editing.
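The abstract only summarizes the idea, so the sketch below illustrates what "content-awareness" could look like in code. This is a minimal PyTorch sketch under our own assumptions, not the paper's implementation: the function names, the [N, 1, H, W] soft content mask (e.g., produced by an off-the-shelf face parser), and the gradient-times-activation saliency rule are all hypothetical choices used only to make the concept concrete.

import torch
import torch.nn.functional as F

def masked_l1_distillation(student_img, teacher_img, mask):
    """Pixel-level distillation restricted to the content region.

    student_img, teacher_img: [N, 3, H, W] generator outputs.
    mask: [N, 1, H, W] soft content mask in [0, 1] (hypothetically
    from a face parser); it broadcasts across the RGB channels.
    """
    per_pixel = F.l1_loss(student_img, teacher_img, reduction="none")
    weighted = per_pixel * mask
    # Normalize by the masked area so the loss scale does not depend
    # on how much of the image the content of interest occupies.
    return weighted.sum() / (mask.sum() * per_pixel.shape[1]).clamp(min=1e-6)

def content_aware_channel_saliency(feat, grad):
    """One plausible channel-importance score for pruning: the mean of
    |activation x gradient| per channel, where the gradient comes from
    backpropagating a content-masked loss. Channels that mainly affect
    background pixels receive small gradients and hence low saliency.

    feat, grad: [N, C, H, W] activations of one conv layer and their
    gradients (captured, e.g., via register_hook or autograd.grad).
    """
    return (feat * grad).abs().mean(dim=(0, 2, 3))  # one score per channel

In this sketch, channels with the lowest saliency would be pruned first, and the masked loss would replace (or complement) a plain full-image distillation term during student training.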


Related Research

03/16/2022 · PPCD-GAN: Progressive Pruning and Class-Aware Distillation for Large-Scale Conditional GANs Compression
We push forward neural network compression research by exploiting a nove...

06/15/2020 · AutoGAN-Distiller: Searching to Compress Generative Adversarial Networks
The compression of Generative Adversarial Networks (GANs) has lately dra...

02/01/2019 · Compressing GANs using Knowledge Distillation
Generative Adversarial Networks (GANs) have been used in several machine...

05/31/2021 · GANs Can Play Lottery Tickets Too
Deep generative adversarial networks (GANs) have gained growing populari...

03/04/2021 · Anycost GANs for Interactive Image Synthesis and Editing
Generative adversarial networks (GANs) have enabled photorealistic image...

01/07/2022 · Microdosing: Knowledge Distillation for GAN based Compression
Recently, significant progress has been made in learned image and video ...

12/21/2022 · Exploring Content Relationships for Distilling Efficient GANs
This paper proposes a content relationship distillation (CRD) to tackle ...
