Progressive Label Distillation: Learning Input-Efficient Deep Neural Networks

01/26/2019
by Zhong Qiu Lin, et al.

Much of the focus in the area of knowledge distillation has been on distilling knowledge from a larger teacher network to a smaller student network. However, there has been little research on how the concept of distillation can be leveraged to distill the knowledge encapsulated in the training data itself into a reduced form. In this study, we explore the concept of progressive label distillation, where we leverage a series of teacher-student network pairs to progressively generate distilled training data for learning deep neural networks with greatly reduced input dimensions. To investigate the efficacy of the proposed progressive label distillation approach, we experimented with learning a deep limited-vocabulary speech recognition network based on generated 500ms input utterances distilled progressively from 1000ms source training data, and demonstrated a significant increase in test accuracy of almost 78%.
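To make the idea concrete, below is a minimal sketch of how such a progressive label distillation pipeline could look in PyTorch. The network architecture, the centre-crop strategy, the distillation temperature, the training loop, and the 16 kHz sampling assumption (so 1000ms corresponds to 16,000 samples) are all illustrative assumptions rather than the authors' exact setup; only the overall chaining of teacher-student pairs over progressively shorter inputs follows the description above.

# Hedged sketch of progressive label distillation for shrinking input length.
# All names, architectures, and hyperparameters here are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallAudioNet(nn.Module):
    """Tiny 1D-conv classifier; global pooling makes it input-length agnostic."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.conv1 = nn.Conv1d(1, 16, kernel_size=9, stride=2, padding=4)
        self.conv2 = nn.Conv1d(16, 32, kernel_size=9, stride=2, padding=4)
        self.head = nn.Linear(32, num_classes)

    def forward(self, x):                            # x: (batch, 1, length)
        x = F.relu(self.conv1(x))
        x = F.relu(self.conv2(x))
        x = F.adaptive_avg_pool1d(x, 1).squeeze(-1)  # global average pooling
        return self.head(x)

def train(model, inputs, targets, soft=False, epochs=5, lr=1e-3):
    """Train on hard labels (soft=False) or teacher-generated soft labels (soft=True)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        logits = model(inputs)
        if soft:
            loss = F.kl_div(F.log_softmax(logits, dim=1), targets, reduction="batchmean")
        else:
            loss = F.cross_entropy(logits, targets)
        opt.zero_grad(); loss.backward(); opt.step()
    return model

def distill_stage(teacher, inputs, new_length, temperature=2.0):
    """Centre-crop the inputs to new_length and label the crops with the teacher."""
    start = (inputs.shape[-1] - new_length) // 2
    cropped = inputs[..., start:start + new_length]
    teacher.eval()
    with torch.no_grad():
        soft_labels = F.softmax(teacher(cropped) / temperature, dim=1)
    return cropped, soft_labels

# Toy data standing in for 1000ms utterances at an assumed 16 kHz sampling rate.
x = torch.randn(256, 1, 16000)
y = torch.randint(0, 10, (256,))

# Stage 0: teacher trained on the full-length source data with hard labels.
teacher = train(SmallAudioNet(), x, y)

# Progressive stages: each teacher labels a shorter crop for the next student.
for length in (12000, 8000):                          # e.g. 750ms, then 500ms crops
    x, soft_labels = distill_stage(teacher, x, length)
    teacher = train(SmallAudioNet(), x, soft_labels, soft=True)

Each stage's student is trained only on the teacher-generated soft labels for the shorter crops, so the final 500ms network never sees the original 1000ms annotations directly.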


