Self Regulated Learning Mechanism for Data Efficient Knowledge Distillation

02/14/2021
by Sourav Mishra, et al.

Existing methods for distillation use the conventional training approach, in which all samples participate equally in the process, and are therefore highly inefficient in terms of data utilization. In this paper, a novel data-efficient approach to transferring knowledge from a teacher model to a student model is presented. The teacher model uses self-regulation to select appropriate samples for training and to identify their significance in the process. During distillation, this significance information can be used along with the soft targets to supervise the student. Depending on how self-regulation and sample-significance information are used to supervise the knowledge-transfer process, three types of distillation are proposed: significance-based, regulated, and hybrid. Experiments on benchmark datasets show that the proposed methods achieve performance comparable to other state-of-the-art knowledge distillation methods while utilizing significantly fewer samples.
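The abstract does not spell out the loss, but a minimal sketch of significance-weighted distillation might look as follows. It assumes a standard temperature-scaled soft-target (KL) term blended with a cross-entropy term and weighted per sample by teacher-derived significance scores; the function name, the temperature T, and the blending weight alpha are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def significance_weighted_kd_loss(student_logits, teacher_logits, labels,
                                  significance, T=4.0, alpha=0.5):
    """Sketch of a per-sample significance-weighted distillation loss.

    significance: tensor of shape (batch,) holding the teacher-derived
    importance of each sample (hypothetical form; the abstract only says
    significance accompanies the soft targets during supervision).
    """
    # Soft-target term: per-sample KL divergence between teacher and
    # student distributions at temperature T (standard Hinton scaling).
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    kd = F.kl_div(log_p_student, p_teacher,
                  reduction="none").sum(dim=1) * (T * T)

    # Hard-label term: standard cross-entropy, kept per sample.
    ce = F.cross_entropy(student_logits, labels, reduction="none")

    # Weight each sample's combined loss by its significance, then average.
    per_sample = alpha * kd + (1.0 - alpha) * ce
    return (significance * per_sample).mean()

if __name__ == "__main__":
    # Tiny smoke test with random tensors (shapes only, no real data).
    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    significance = torch.rand(8)  # stand-in for teacher-derived scores
    print(significance_weighted_kd_loss(student_logits, teacher_logits,
                                        labels, significance))
```

In this sketch, samples the teacher deems insignificant contribute little to the student's gradient, which is one plausible way to realize the paper's claim of matching state-of-the-art performance with fewer samples.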


