Stagewise Knowledge Distillation

11/15/2019
by Akshay Kulkarni, et al.

Deploying modern Deep Learning models requires high computational power, yet many applications target embedded devices such as smartphones and wearables that lack such resources. This calls for compact networks that reduce computation while preserving performance. Knowledge Distillation is one method for achieving this. Traditional Knowledge Distillation methods transfer knowledge from teacher to student in a single stage. We propose progressive stagewise training to improve this transfer. We also show that the method works even with a fraction of the data used to train the teacher model, without compromising the evaluation metric. This method can complement other model compression methods and can also be viewed as a generalized model compression technique.
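To make the idea concrete, here is a toy sketch of stagewise distillation. This is purely illustrative and not the authors' implementation: the teacher and student are stacks of scalar linear "stages", and each student stage is trained in turn to mimic the output of the corresponding teacher stage under an MSE loss, with earlier, already-distilled student stages kept frozen. All names, weights, and hyperparameters below are assumptions for the example.

```python
# Toy sketch of stagewise knowledge distillation (illustrative only, not the
# paper's implementation). Stage k maps its input x to weights[k] * x; the
# "network" is the composition of its stages.

def forward(weights, x, upto):
    """Run input x through stages 0..upto and return the output."""
    out = x
    for w in weights[:upto + 1]:
        out = w * out
    return out

def stagewise_distill(teacher, student, data, lr=0.01, epochs=200):
    """Train the student one stage at a time against the teacher's
    corresponding intermediate outputs (plain gradient descent on MSE)."""
    for k in range(len(teacher)):
        for _ in range(epochs):
            for x in data:
                s_in = forward(student, x, k - 1)    # frozen earlier stages
                target = forward(teacher, x, k)      # teacher's stage-k output
                pred = s_in * student[k]
                grad = 2.0 * (pred - target) * s_in  # d(MSE)/d(student[k])
                student[k] -= lr * grad
    return student

teacher = [2.0, 0.5, 3.0]   # hypothetical "trained" teacher stage weights
student = [1.0, 1.0, 1.0]   # student starts untrained
stagewise_distill(teacher, student, data=[1.0, 2.0, 3.0])
```

After training, each student stage's weight converges to the matching teacher stage's weight, because each stage was supervised by the teacher's intermediate representation rather than only by the final output — the key difference from single-stage distillation.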


