Distillation from heterogeneous unlabeled collections

01/17/2022
by Jean-Michel Begon, et al.

Compressing deep networks is essential to expand their range of applications to constrained settings. The need for compression, however, often arises long after the model was trained, when the original data may no longer be available. On the other hand, unlabeled data, not necessarily related to the target task, is usually plentiful, especially in image classification tasks. In this work, we propose a scheme to leverage such samples to distill the knowledge learned by a large teacher network into a smaller student. The proposed technique relies on (i) preferentially sampling datapoints that appear related to the target task, and (ii) taking better advantage of the learning signal. We show that the former speeds up the student's convergence, while the latter boosts its performance, achieving performance close to what can be expected with the original data.
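
The two levers named in the abstract can be illustrated with a minimal sketch: sample the unlabeled pool preferentially where the teacher looks confident (a simple proxy for "appearing related"), and train the student on the teacher's soft outputs. This is only an assumption-laden illustration, not the paper's exact method; it presumes PyTorch modules for the teacher and student, an unlabeled image tensor, teacher confidence as the relatedness score, and a standard temperature-scaled KL distillation loss.

# Sketch: confidence-weighted sampling of an unlabeled pool + soft-label distillation.
# Not the paper's method; teacher/student/data and all hyperparameters are assumptions.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

def confidence_weights(teacher, data, batch_size=256, device="cpu"):
    """Score each unlabeled example by the teacher's max softmax probability."""
    teacher.eval()
    scores = []
    with torch.no_grad():
        for (x,) in DataLoader(TensorDataset(data), batch_size=batch_size):
            probs = F.softmax(teacher(x.to(device)), dim=1)
            scores.append(probs.max(dim=1).values.cpu())
    return torch.cat(scores)

def distill(teacher, student, data, steps=1000, batch_size=64,
            temperature=4.0, lr=1e-3, device="cpu"):
    """Train the student to match the teacher's soft outputs on sampled unlabeled data."""
    weights = confidence_weights(teacher, data, device=device)
    # Datapoints that look "related" (high teacher confidence) are drawn more often.
    sampler = WeightedRandomSampler(weights, num_samples=steps * batch_size,
                                    replacement=True)
    loader = DataLoader(TensorDataset(data), batch_size=batch_size, sampler=sampler)
    opt = torch.optim.Adam(student.parameters(), lr=lr)
    student.train()
    for (x,) in loader:
        x = x.to(device)
        with torch.no_grad():
            t_logits = teacher(x)
        s_logits = student(x)
        # Temperature-scaled KL divergence between student and teacher distributions.
        loss = F.kl_div(F.log_softmax(s_logits / temperature, dim=1),
                        F.softmax(t_logits / temperature, dim=1),
                        reduction="batchmean") * temperature ** 2
        opt.zero_grad()
        loss.backward()
        opt.step()
    return student

In this sketch the sampling weights address point (i) of the abstract (faster convergence by focusing on seemingly related samples), while matching the full soft output distribution, rather than hard teacher labels, is one common way to make fuller use of the learning signal, in the spirit of point (ii).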

