Function-Consistent Feature Distillation

04/24/2023
by Dongyang Liu, et al.

Feature distillation makes the student mimic the intermediate features of the teacher. Nearly all existing feature-distillation methods use L2 distance or slight variants of it as the distance metric between teacher and student features. However, while L2 distance is isotropic w.r.t. all dimensions, the network's processing of different dimensions is usually anisotropic, i.e., perturbations with the same 2-norm but in different dimensions of an intermediate feature lead to changes in the final output of vastly different magnitudes. Considering this, we argue that the similarity between teacher and student features should not be measured merely by their appearance (i.e., L2 distance) but, more importantly, by their difference in function, namely how later layers of the network read, decode, and process them. Therefore, we propose Function-Consistent Feature Distillation (FCFD), which explicitly optimizes the functional similarity between teacher and student features. The core idea of FCFD is to make teacher and student features not only numerically similar but, more importantly, produce similar outputs when fed to the later part of the same network. With FCFD, the student mimics the teacher more faithfully and learns more from the teacher. Extensive experiments on image classification and object detection demonstrate the superiority of FCFD over existing methods. Furthermore, FCFD can be combined with many existing methods to obtain even higher accuracy. Our code is available at https://github.com/LiuDongyang6/FCFD.
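
The core idea above admits a simple training-time construction: route the student's intermediate feature through the teacher's later layers and penalize any difference between the resulting output and the teacher's own. The sketch below illustrates this in PyTorch with toy head/tail modules; the module names (`teacher_tail`, `student_tail`, `proj`), the 1x1 projector, and the KL-divergence term on logits are illustrative assumptions, not the paper's exact formulation (see the linked repository for the official implementation).

```python
# Minimal sketch of function-consistent feature distillation (assumed setup, not the official FCFD code).
import torch
import torch.nn as nn
import torch.nn.functional as F

# Toy "early" and "late" parts of a teacher and a student network.
teacher_head = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
teacher_tail = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 10))
student_head = nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU())
student_tail = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10))

# Freeze the teacher: gradients still flow through it to the student feature.
for p in list(teacher_head.parameters()) + list(teacher_tail.parameters()):
    p.requires_grad_(False)

# Projector aligning student channels to the teacher's (assumed 1x1 conv).
proj = nn.Conv2d(32, 64, 1)

def fcfd_style_loss(x, T=4.0):
    with torch.no_grad():
        f_t = teacher_head(x)          # teacher intermediate feature
        logits_t = teacher_tail(f_t)   # teacher's own output
    f_s_proj = proj(student_head(x))   # student feature, projected to teacher width

    # 1) Appearance term: plain L2 matching of features.
    loss_appearance = F.mse_loss(f_s_proj, f_t)

    # 2) Function term: the student feature, read by the teacher's later
    #    layers, should yield an output close to the teacher's own.
    logits_cross = teacher_tail(f_s_proj)
    loss_function = F.kl_div(
        F.log_softmax(logits_cross / T, dim=1),
        F.softmax(logits_t / T, dim=1),
        reduction="batchmean",
    ) * T * T
    return loss_appearance + loss_function

loss = fcfd_style_loss(torch.randn(8, 3, 32, 32))
loss.backward()  # gradients reach student_head and proj only
```

Because the function term is measured after the teacher's later layers, errors along feature dimensions those layers are sensitive to are penalized more than errors along dimensions they largely ignore, which is exactly the anisotropy argument made in the abstract.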
