Does Knowledge Distillation Really Work?

06/10/2021
by Samuel Stanton, et al.

Knowledge distillation is a popular technique for training a small student network to emulate a larger teacher model, such as an ensemble of networks. We show that while knowledge distillation can improve student generalization, it does not typically work as it is commonly understood: there often remains a surprisingly large discrepancy between the predictive distributions of the teacher and the student, even in cases when the student has the capacity to perfectly match the teacher. We identify difficulties in optimization as a key reason for why the student is unable to match the teacher. We also show how the details of the dataset used for distillation play a role in how closely the student matches the teacher – and that more closely matching the teacher paradoxically does not always lead to better student generalization.
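To make the setup concrete, here is a minimal sketch of the standard (Hinton-style) distillation objective the abstract refers to: the student is trained to match the teacher's temperature-softened predictive distribution, usually mixed with an ordinary cross-entropy term on the hard labels. The function names, temperature, and mixing weight below are illustrative choices, not values from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """alpha * KL(teacher || student) at temperature T, plus
    (1 - alpha) * cross-entropy on the hard labels.
    T and alpha are illustrative hyperparameters."""
    p_teacher = softmax(teacher_logits, T)
    log_p_student = np.log(softmax(student_logits, T))
    # Per-example KL divergence between softened distributions.
    kl = np.sum(p_teacher * (np.log(p_teacher) - log_p_student), axis=-1)
    # Standard cross-entropy on the true labels at temperature 1.
    log_p_hard = np.log(softmax(student_logits, 1.0))
    ce = -log_p_hard[np.arange(len(labels)), labels]
    # The T**2 factor keeps the soft-term gradient scale comparable across T.
    return np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce)
```

The paper's central question is about the KL term: even when it can in principle be driven to zero (the student has enough capacity to match the teacher), optimization in practice leaves it far from zero, and driving it lower does not always improve student generalization.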

Related research

04/24/2023: Improving Knowledge Distillation Via Transferring Learning Ability
Existing knowledge distillation methods generally use a teacher-student ...

05/16/2023: Tailoring Instructions to Student's Learning Levels Boosts Knowledge Distillation
It has been commonly observed that a teacher model with superior perform...

10/08/2019: Knowledge Distillation from Internal Representations
Knowledge distillation is typically conducted by training a small model ...

10/31/2021: Rethinking the Knowledge Distillation From the Perspective of Model Calibration
Recent years have witnessed dramatic improvements in the knowledge d...

10/10/2020: Structural Knowledge Distillation
Knowledge distillation is a critical technique to transfer knowledge bet...

10/03/2019: On the Efficacy of Knowledge Distillation
In this paper, we present a thorough evaluation of the efficacy of knowl...

04/20/2021: Knowledge Distillation as Semiparametric Inference
A popular approach to model compression is to train an inexpensive stude...
