Model Compression Using Optimal Transport

12/07/2020
by Suhas Lohit, et al.

Model compression methods are important for easier deployment of deep learning models in compute-, memory-, and energy-constrained environments such as mobile phones. Knowledge distillation is a class of model compression algorithms in which knowledge from a large teacher network is transferred to a smaller student network, thereby improving the student's performance. In this paper, we show how optimal transport-based loss functions can be used to train a student network, encouraging student parameters that bring the distribution of student features closer to that of the teacher features. We present image classification results on CIFAR-100, SVHN, and ImageNet and show that the proposed optimal transport loss functions perform comparably to or better than other loss functions.
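The abstract does not spell out the loss itself; as a minimal sketch (not the authors' implementation), one common way to realize an optimal transport loss between minibatches of student and teacher features is an entropic-regularized (Sinkhorn) distance. The function name sinkhorn_ot_loss, the feature normalization, the regularization strength eps, and the number of iterations below are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F


def sinkhorn_ot_loss(student_feats, teacher_feats, eps=0.05, n_iters=50):
    """Entropic-regularized OT distance between two batches of features.

    student_feats: (n, d) tensor, teacher_feats: (m, d) tensor.
    Uses uniform marginals and a squared-Euclidean cost on normalized features.
    """
    # Pairwise squared-Euclidean cost between L2-normalized features.
    s = F.normalize(student_feats, dim=1)
    t = F.normalize(teacher_feats, dim=1)
    cost = torch.cdist(s, t, p=2) ** 2                     # (n, m) cost matrix

    n, m = cost.shape
    mu = torch.full((n,), 1.0 / n, device=cost.device)     # uniform source marginal
    nu = torch.full((m,), 1.0 / m, device=cost.device)     # uniform target marginal

    K = torch.exp(-cost / eps)                             # Gibbs kernel
    u = torch.ones_like(mu)
    for _ in range(n_iters):                               # Sinkhorn fixed-point updates
        v = nu / (K.t() @ u + 1e-8)
        u = mu / (K @ v + 1e-8)

    pi = u.unsqueeze(1) * K * v.unsqueeze(0)               # approximate transport plan
    return torch.sum(pi * cost)                            # <pi, C>: transport cost
```

In a distillation setup this term would typically be added to the usual cross-entropy loss on the student's logits, e.g. loss = ce_loss + lambda_ot * sinkhorn_ot_loss(student_feats, teacher_feats.detach()), where lambda_ot is a tunable weight (again an assumption, not the paper's reported setting).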


Related research

04/25/2022  Faculty Distillation with Optimal Transport
Knowledge distillation (KD) has shown its effectiveness in improving a s...

04/04/2023  Optimal Transport for Correctional Learning
The contribution of this paper is a generalized formulation of correctio...

05/01/2020  Distilling Spikes: Knowledge Distillation in Spiking Neural Networks
Spiking Neural Networks (SNN) are energy-efficient computing architectur...

05/18/2023  Student-friendly Knowledge Distillation
In knowledge distillation, the knowledge from the teacher model is often...

10/29/2018  Learning to Teach with Dynamic Loss Functions
Teaching is critical to human society: it is with teaching that prospect...

12/05/2018  Model Compression with Generative Adversarial Networks
More accurate machine learning models often demand more computation and ...

07/13/2020  Representation Transfer by Optimal Transport
Deep learning currently provides the best representations of complex obj...
