Information-Theoretic GAN Compression with Variational Energy-based Model

03/28/2023
by   Minsoo Kang, et al.

We propose an information-theoretic knowledge distillation approach for compressing generative adversarial networks, which maximizes the mutual information between teacher and student networks via a variational optimization based on an energy-based model. Because direct computation of mutual information in continuous domains is intractable, our approach instead optimizes the student network by maximizing a variational lower bound of the mutual information. To achieve a tight lower bound, we introduce an energy-based model, parameterized by a deep neural network, that represents a flexible variational distribution capable of handling high-dimensional images and effectively capturing spatial dependencies between pixels. Since the proposed method is a generic optimization algorithm, it can be conveniently incorporated into arbitrary generative adversarial networks and even dense prediction networks, e.g., image enhancement models. We demonstrate that the proposed algorithm consistently achieves outstanding performance in model compression of generative adversarial networks when combined with several existing models.
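The variational lower bound mentioned in the abstract follows the standard Barber–Agakov argument; a sketch in illustrative notation (the symbols T, S, q, and E_φ are our own assumptions, not necessarily the paper's):

```latex
% Mutual information between teacher features T and student features S:
%   I(T; S) = H(T) - H(T \mid S).
% Replacing the intractable posterior p(t \mid s) with a variational
% distribution q(t \mid s) gives a lower bound, since KL(p \,\|\, q) \ge 0:
I(T; S) \;\ge\; H(T) + \mathbb{E}_{p(t,s)}\!\big[\log q(t \mid s)\big].
% Modeling q with an energy-based model whose energy E_\phi is a deep network:
q(t \mid s) \;=\; \frac{\exp\!\big(-E_\phi(t, s)\big)}{Z_\phi(s)},
\qquad
Z_\phi(s) \;=\; \int \exp\!\big(-E_\phi(t, s)\big)\, dt.
```

Because H(T) does not depend on the student, maximizing the bound reduces to maximizing the expected log-likelihood term, i.e., training the student so that the teacher's features are assigned low energy given the student's features.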


