Exploring the Connection between Knowledge Distillation and Logits Matching

09/14/2021
by Defang Chen, et al.

Knowledge distillation can be viewed as a generalized logits matching technique for model compression. The equivalence of the two was previously established only under the conditions of infinite temperature and zero-mean normalization. In this paper, we prove that with infinite temperature alone, knowledge distillation is equivalent to logits matching with an extra regularization term. Furthermore, we reveal that a weaker additional condition, equal-mean initialization rather than the original zero-mean normalization, already suffices to establish the equivalence. The key to our proof is the observation that in modern neural networks with the cross-entropy loss and softmax activation, the mean of the back-propagated gradient on the logits is always zero.
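The zero-mean gradient property mentioned in the abstract is easy to verify numerically: for cross-entropy loss with softmax activation, the gradient with respect to the logits is softmax(z) minus the one-hot label, and both terms sum to 1 over the classes. Below is a minimal sketch (not from the paper's code) checking this for arbitrary logits:

```python
import numpy as np

# For cross-entropy loss L = -log softmax(z)[y], the gradient w.r.t. the
# logits z is softmax(z) - onehot(y). Both vectors sum to 1 over the
# classes, so the gradient sums (and hence averages) to exactly zero.
rng = np.random.default_rng(0)
z = rng.normal(size=10)        # arbitrary logits for 10 classes
y = 3                          # arbitrary ground-truth class index

p = np.exp(z - z.max())
p /= p.sum()                   # softmax probabilities

grad = p.copy()
grad[y] -= 1.0                 # dL/dz = softmax(z) - onehot(y)

print(grad.mean())             # ~0 up to floating-point error
```

The same argument applies to the distillation term: the gradient there is a difference of two softmax distributions (student and teacher), each summing to 1, so its mean over the logits is also zero.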


Related research

12/01/2018: On Compressing U-net Using Knowledge Distillation
We study the use of knowledge distillation to compress the U-net archite...

08/01/2023: NormKD: Normalized Logits for Knowledge Distillation
Logit based knowledge distillation gets less attention in recent years s...

03/09/2020: Knowledge distillation via adaptive instance normalization
This paper addresses the problem of model compression via knowledge dist...

12/08/2022: Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection
To circumvent the non-parallelizability of recurrent neural network-base...

09/04/2019: Empirical Analysis of Knowledge Distillation Technique for Optimization of Quantized Deep Neural Networks
Knowledge distillation (KD) is a very popular method for model size redu...

05/23/2023: Decoupled Kullback-Leibler Divergence Loss
In this paper, we delve deeper into the Kullback-Leibler (KL) Divergence...

02/23/2023: Practical Knowledge Distillation: Using DNNs to Beat DNNs
For tabular data sets, we explore data and model distillation, as well a...
