Knowledge Distillation as Semiparametric Inference

04/20/2021
by Tri Dao, et al.

A popular approach to model compression is to train an inexpensive student model to mimic the class probabilities of a highly accurate but cumbersome teacher model. Surprisingly, this two-step knowledge distillation process often leads to higher accuracy than training the student directly on labeled data. To explain and enhance this phenomenon, we cast knowledge distillation as a semiparametric inference problem with the optimal student model as the target, the unknown Bayes class probabilities as nuisance, and the teacher probabilities as a plug-in nuisance estimate. By adapting modern semiparametric tools, we derive new guarantees for the prediction error of standard distillation and develop two enhancements – cross-fitting and loss correction – to mitigate the impact of teacher overfitting and underfitting on student performance. We validate our findings empirically on both tabular and image data and observe consistent improvements from our knowledge distillation enhancements.
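To make the setup concrete, below is a minimal sketch (not the authors' code) of teacher-student distillation with K-fold cross-fitting, the first enhancement described in the abstract: each training example receives its soft labels from a teacher that never saw that example, so teacher overfitting is less likely to propagate to the student. PyTorch and scikit-learn's KFold are assumed, and the model architectures, hyperparameters, and helper names (make_mlp, fit, distill_cross_fit) are purely illustrative.

```python
# Illustrative sketch of knowledge distillation with K-fold cross-fitting.
# All names and hyperparameters are assumptions for demonstration only.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F
from sklearn.model_selection import KFold


def make_mlp(in_dim, n_classes, hidden=64):
    """A small MLP classifier returning unnormalized logits."""
    return nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, n_classes))


def fit(model, X, targets, soft=False, epochs=100, lr=1e-2):
    """Full-batch training: cross-entropy on hard labels, or KL divergence to soft targets."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        logits = model(X)
        if soft:
            # Student mimics the teacher's class probabilities.
            loss = F.kl_div(F.log_softmax(logits, dim=1), targets, reduction="batchmean")
        else:
            loss = F.cross_entropy(logits, targets)
        loss.backward()
        opt.step()
    return model


def distill_cross_fit(X, y, n_classes, n_folds=3):
    """Cross-fitted distillation: out-of-fold teachers supply the soft labels."""
    soft_labels = torch.zeros(len(X), n_classes)
    for train_np, held_np in KFold(n_folds, shuffle=True, random_state=0).split(np.arange(len(X))):
        tr, held = torch.as_tensor(train_np), torch.as_tensor(held_np)
        # Teacher is trained without the held-out fold, then labels that fold.
        teacher = fit(make_mlp(X.shape[1], n_classes, hidden=128), X[tr], y[tr])
        with torch.no_grad():
            soft_labels[held] = F.softmax(teacher(X[held]), dim=1)
    # The inexpensive student is trained on the out-of-fold teacher probabilities.
    student = make_mlp(X.shape[1], n_classes, hidden=16)
    return fit(student, X, soft_labels, soft=True)


if __name__ == "__main__":
    torch.manual_seed(0)
    X = torch.randn(600, 20)
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).long()  # toy binary labels
    student = distill_cross_fit(X, y, n_classes=2)
    acc = (student(X).argmax(dim=1) == y).float().mean().item()
    print(f"student training accuracy: {acc:.3f}")
```

The paper's second enhancement, loss correction, modifies the distillation objective itself to account for an imperfect teacher and is not shown in this sketch.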


Related research

06/10/2021 · Does Knowledge Distillation Really Work?
Knowledge distillation is a popular technique for training a small stude...

05/21/2020 · Why distillation helps: a statistical perspective
Knowledge distillation is a technique for improving the performance of a...

03/31/2022 · Conditional Autoregressors are Interpretable Classifiers
We explore the use of class-conditional autoregressive (CA) models to pe...

05/25/2023 · Triplet Knowledge Distillation
In Knowledge Distillation, the teacher is generally much larger than the...

11/07/2019 · Teacher-Student Training for Robust Tacotron-based TTS
While neural end-to-end text-to-speech (TTS) is superior to conventional...

07/20/2023 · Reverse Knowledge Distillation: Training a Large Model using a Small One for Retinal Image Matching on Limited Data
Retinal image matching plays a crucial role in monitoring disease progre...

10/09/2021 · Visualizing the embedding space to explain the effect of knowledge distillation
Recent research has found that knowledge distillation can be effective i...
