Layerwise Bregman Representation Learning with Applications to Knowledge Distillation

09/15/2022
by Ehsan Amid, et al.

In this work, we propose a novel approach for layerwise representation learning of a trained neural network. In particular, we form a Bregman divergence based on the layer's transfer function and construct an extension of the original Bregman PCA formulation by incorporating a mean vector and normalizing the principal directions with respect to the geometry of the local convex function around the mean. This generalization allows exporting the learned representation as a fixed layer with a non-linearity. As an application to knowledge distillation, we cast the learning problem for the student network as predicting the compression coefficients of the teacher's representations, which are passed as the input to the imported layer. Our empirical findings indicate that our approach is substantially more effective for transferring information between networks than typical teacher-student training using the teacher's penultimate layer representations and soft labels.
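The abstract does not spell out the construction, but the objects it refers to are standard, so the following notational sketch may help; the basis, mean, and reconstruction map below are illustrative placeholders, not the authors' exact formulation. For a strictly convex, differentiable generator F, the Bregman divergence between x and y is

    D_F(x, y) = F(x) - F(y) - \langle \nabla F(y),\, x - y \rangle .

When a layer's transfer function is the gradient of such a generator, f = \nabla F (for instance, softmax is the gradient of the log-sum-exp function), D_F is the divergence induced by that layer's non-linearity. In this notation, a Bregman-PCA-style compression of a teacher activation h would be a low-dimensional coefficient vector c with f(U c + m) \approx f(h) for some principal directions U and mean vector m (symbols assumed here for illustration), and the distillation step described above would train the student to predict c, which is then passed as input to the imported fixed layer c \mapsto f(U c + m).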


Related research

09/03/2022: A Novel Self-Knowledge Distillation Approach with Siamese Representation Learning for Action Recognition
04/19/2023: Knowledge Distillation Under Ideal Joint Classifier Assumption
10/23/2019: Contrastive Representation Distillation
09/17/2021: Distilling Linguistic Context for Language Model Compression
10/26/2020: Activation Map Adaptation for Effective Knowledge Distillation
02/06/2023: Audio Representation Learning by Distilling Video as Privileged Information
06/20/2023: RoTaR: Efficient Row-Based Table Representation Learning via Teacher-Student Training
