Distilling Representations from GAN Generator via Squeeze and Span

11/06/2022
by   Yu Yang, et al.

In recent years, generative adversarial networks (GANs) have been an actively studied topic and have been shown to produce high-quality, realistic images in various domains. The controllable synthesis ability of GAN generators suggests that they maintain informative, disentangled, and explainable image representations, yet leveraging and transferring these representations to downstream tasks remains largely unexplored. In this paper, we propose to distill knowledge from GAN generators by squeezing and spanning their representations. We squeeze the generator features into representations that are invariant to semantic-preserving transformations through a network before distilling them into the student network. We then span the distilled representation from the synthetic domain to the real domain by also training on real data, which remedies the mode collapse of GANs and boosts the student network's performance in the real domain. Experiments demonstrate the efficacy of our method and its significance for self-supervised representation learning. Code is available at https://github.com/yangyu12/squeeze-and-span.
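To make the two ingredients concrete, below is a minimal PyTorch-style sketch of how a "squeeze" head and a combined squeeze/span training step might look. The `generator`, `student`, and `augment` objects and their interfaces (e.g., the generator exposing a `z_dim` attribute and returning intermediate features alongside images) are hypothetical assumptions for illustration, not the authors' API; the actual objective and architecture are in the paper and the linked repository.

```python
# Hypothetical sketch of squeeze-and-span distillation (not the official code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SqueezeHead(nn.Module):
    """Maps raw generator features to a compact representation that is
    encouraged to be invariant to semantic-preserving augmentations."""
    def __init__(self, in_dim: int, out_dim: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, in_dim), nn.ReLU(inplace=True),
            nn.Linear(in_dim, out_dim),
        )

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        return F.normalize(self.net(feats), dim=-1)

def distill_step(generator, squeeze_head, student, augment, real_images):
    """One illustrative training step combining the two ideas:
    - squeeze: distill augmentation-invariant generator features into
      the student on synthetic images;
    - span: add a self-supervised consistency loss on real images so the
      student's representation covers modes the GAN may have collapsed."""
    # --- Squeeze on synthetic data -----------------------------------
    z = torch.randn(real_images.size(0), generator.z_dim,
                    device=real_images.device)
    fake_images, gen_feats = generator(z)      # assumed to return both
    target = squeeze_head(gen_feats).detach()  # squeezed teacher target
    pred = F.normalize(student(augment(fake_images)), dim=-1)
    squeeze_loss = (2 - 2 * (pred * target).sum(dim=-1)).mean()

    # --- Span to the real domain --------------------------------------
    # Two stochastic augmentations of the same real batch should map to
    # nearby student representations.
    v1 = F.normalize(student(augment(real_images)), dim=-1)
    v2 = F.normalize(student(augment(real_images)), dim=-1)
    span_loss = (2 - 2 * (v1 * v2).sum(dim=-1)).mean()

    return squeeze_loss + span_loss
```

The BYOL-style loss (2 minus twice the cosine similarity of normalized vectors) is used here only as one plausible choice of invariance objective; any self-supervised consistency loss could fill the same role in this sketch.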

