Mapping Emulation for Knowledge Distillation

05/21/2022
by Jing Ma, et al.

This paper formalizes the source-blind knowledge distillation problem, which is essential to federated learning: the student must learn from the teacher without access to the teacher's training data. A new geometric perspective views this problem as aligning the generated distributions of the teacher and student. Guided by this perspective, a new architecture, MEKD, is proposed that emulates the inverse mapping of the classifier through generative adversarial training. Unlike mimicking logits or aligning logit distributions, reconstructing the mapping from classifier logits has a geometric interpretation of decreasing empirical distances, with theoretical guarantees from universal function approximation and optimal mass transportation theories. A new algorithm is also proposed to train the student model so that it reaches the teacher's performance source-blindly. On various benchmarks, MEKD outperforms existing source-blind KD methods, and ablation studies and visualized results explain its behavior.
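To make the source-blind setting concrete, here is a minimal toy sketch in NumPy: the student never sees the teacher's training data and learns only from the teacher's logits on synthetic inputs. This illustrates the basic logit-matching baseline that the abstract contrasts MEKD against, not the MEKD architecture itself; the linear models, learning rate, and input distribution are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_out = 8, 3

# Fixed "teacher" whose original training data is unavailable (source-blind).
W_teacher = rng.normal(size=(d_in, d_out))

def teacher_logits(x):
    return x @ W_teacher

# Student starts from scratch and is fitted only on synthetic inputs.
W_student = np.zeros((d_in, d_out))
lr = 0.05
for step in range(2000):
    x = rng.normal(size=(32, d_in))          # stand-in for generated samples
    err = x @ W_student - teacher_logits(x)  # logit mismatch on this batch
    W_student -= lr * (x.T @ err) / len(x)   # gradient step on mean squared error

# The student recovers the teacher's mapping without ever seeing source data.
print(float(np.abs(W_student - W_teacher).max()))
```

In this noiseless linear toy the student converges to the teacher exactly; the paper's point is that for deep classifiers, plain logit matching on generated inputs is weaker than reconstructing the inverse mapping adversarially.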

