Explicit and Implicit Knowledge Distillation via Unlabeled Data

02/17/2023
by Yuzheng Wang, et al.

Data-free knowledge distillation is a challenging model compression task for scenarios in which the original training dataset is unavailable. Previous methods incur substantial extra computational cost to update one or more generators, and their naive imitation learning lowers distillation efficiency. Motivated by these observations, we first propose an efficient unlabeled sample selection method that replaces computationally expensive generators, and we focus on improving the training efficiency of the selected samples. We then design a class-dropping mechanism to suppress the label noise caused by data domain shifts. Finally, we propose a distillation method that incorporates explicit features and implicit structured relations to improve the effect of distillation. Experimental results show that our method converges quickly and achieves higher accuracy than other state-of-the-art methods.
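To make the last component more concrete, the following is a minimal sketch of how an explicit (logit-level) distillation term might be combined with an implicit structured-relation term computed over a batch of unlabeled samples. The function names, the pairwise cosine-similarity relation measure, and the loss weights are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of a combined explicit + implicit distillation loss.
# The specific relation measure (batch-wise cosine similarity) and the
# weights alpha/beta are assumptions for illustration only.
import torch
import torch.nn.functional as F

def explicit_kd_loss(student_logits, teacher_logits, temperature=4.0):
    """Explicit term: KL divergence between softened teacher and student predictions."""
    log_p_s = F.log_softmax(student_logits / temperature, dim=1)
    p_t = F.softmax(teacher_logits / temperature, dim=1)
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2

def implicit_relation_loss(student_feats, teacher_feats):
    """Implicit term: match the pairwise similarity structure of a batch of features."""
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    return F.mse_loss(s @ s.T, t @ t.T)

def distillation_loss(student_out, teacher_out, alpha=1.0, beta=1.0):
    """Weighted sum of the explicit and implicit terms (weights are placeholders)."""
    logits_s, feats_s = student_out
    logits_t, feats_t = teacher_out
    return (alpha * explicit_kd_loss(logits_s, logits_t)
            + beta * implicit_relation_loss(feats_s, feats_t))
```

In this sketch the explicit term transfers per-sample predictions, while the implicit term aligns the teacher's and student's batch-wise similarity structure, which is one common way to encode structured relations between samples in relational distillation.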
