Data-Free Knowledge Amalgamation via Group-Stack Dual-GAN

03/20/2020
by   Jingwen Ye, et al.

Recent advances in deep learning have provided procedures for learning one network to amalgamate multiple streams of knowledge from pre-trained Convolutional Neural Network (CNN) models, thus reducing the annotation cost. However, almost all existing methods demand massive amounts of training data, which may be unavailable due to privacy or transmission issues. In this paper, we propose a data-free knowledge amalgamation strategy to craft a well-behaved multi-task student network from multiple single/multi-task teachers. The main idea is to construct group-stack generative adversarial networks (GANs) consisting of two dual generators. First, one generator is trained to collect the knowledge by reconstructing images that approximate the original dataset used to pre-train the teachers. Then a dual generator is trained by taking the output of the former generator as input. Finally, we treat the dual-part generator as the target network and regroup it. As demonstrated on several multi-label classification benchmarks, the proposed method achieves surprisingly competitive results without any training data, even compared with some fully supervised methods.
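To make the two-stage training concrete, below is a minimal PyTorch-style sketch, assuming frozen pre-trained teachers and a multi-label student. The module names (ImageGenerator, amalgamation_step), losses, and sizes are illustrative assumptions rather than the paper's exact group-stack dual-GAN architecture.

```python
# Sketch of the two-stage, data-free idea: (1) a generator synthesizes images that
# the frozen teachers find "realistic" (confident predictions), and (2) a dual/student
# network is distilled on those synthetic images against the teachers' outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImageGenerator(nn.Module):
    """Maps noise vectors to images meant to mimic the teachers' unseen training data."""
    def __init__(self, z_dim=100, img_ch=3, img_size=32):
        super().__init__()
        s = img_size // 4
        self.net = nn.Sequential(
            nn.Linear(z_dim, 128 * s * s),
            nn.ReLU(inplace=True),
            nn.Unflatten(1, (128, s, s)),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(128, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(inplace=True),
            nn.Upsample(scale_factor=2),
            nn.Conv2d(64, img_ch, 3, padding=1), nn.Tanh(),
        )

    def forward(self, z):
        return self.net(z)

def amalgamation_step(generator, student, teachers, opt_g, opt_s, batch=64, z_dim=100):
    """One alternating update. `teachers` is a list of frozen pre-trained networks
    whose (multi-label) outputs are concatenated to supervise the student."""
    z = torch.randn(batch, z_dim)

    # Stage 1: update the generator so the frozen teachers make confident
    # (low-entropy) predictions on its fakes -- a common data-free proxy for realism.
    fake = generator(z)
    logits = [t(fake) for t in teachers]
    g_loss = sum((-F.softmax(l, dim=1) * F.log_softmax(l, dim=1)).sum(dim=1).mean()
                 for l in logits)
    opt_g.zero_grad()
    g_loss.backward()
    opt_g.step()

    # Stage 2: distill the dual (student) network on the generated images,
    # matching the concatenated teacher predictions as soft multi-label targets.
    fake = generator(z).detach()
    with torch.no_grad():
        target = torch.cat([torch.sigmoid(t(fake)) for t in teachers], dim=1)
    s_loss = F.binary_cross_entropy_with_logits(student(fake), target)
    opt_s.zero_grad()
    s_loss.backward()
    opt_s.step()
    return g_loss.item(), s_loss.item()
```

In practice one would alternate amalgamation_step over many iterations and, as described above, finally regroup the distilled dual part as the target multi-task network.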

Related research

research · 05/28/2019
Amalgamating Filtered Knowledge: Learning Task-customized Student from Multi-task Teachers
Many well-trained Convolutional Neural Network (CNN) models have now been...

research · 04/02/2019
Data-Free Learning of Student Networks
Learning portable neural networks is very essential for computer vision ...

research · 12/31/2021
Conditional Generative Data-Free Knowledge Distillation based on Attention Transfer
Knowledge distillation has made remarkable achievements in model compres...

research · 09/18/2023
Dual Student Networks for Data-Free Model Stealing
Existing data-free model stealing methods use a generator to produce sam...

research · 11/06/2022
Distilling Representations from GAN Generator via Squeeze and Span
In recent years, generative adversarial networks (GANs) have been an act...

research · 04/14/2023
PTW: Pivotal Tuning Watermarking for Pre-Trained Image Generators
Deepfakes refer to content synthesized using deep generators, which, whe...

research · 07/27/2022
Federated Selective Aggregation for Knowledge Amalgamation
In this paper, we explore a new knowledge-amalgamation problem, termed F...
