Normalized Feature Distillation for Semantic Segmentation

07/12/2022
by Tao Liu, et al.

As a promising approach to model compression, knowledge distillation improves the performance of a compact model by transferring knowledge from a cumbersome one. The kind of knowledge used to guide the student's training is important. Previous distillation methods for semantic segmentation strive to extract various forms of knowledge from the features, which involves elaborate manual design that relies on prior information and yields limited performance gains. In this paper, we propose a simple yet effective feature distillation method called normalized feature distillation (NFD), which aims to distill the original features effectively without manually designing new forms of knowledge. The key idea is to use normalization to prevent the student from focusing on imitating the magnitude of the teacher's feature responses. Our method achieves state-of-the-art distillation results for semantic segmentation on the Cityscapes, VOC 2012, and ADE20K datasets. Code will be made available.
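To make the idea concrete, below is a minimal PyTorch sketch of a normalized feature imitation loss in the spirit described by the abstract. The class name NFDLoss, the per-sample standardization (zero mean, unit variance over each feature map), the 1x1 channel-alignment convolution, and the plain L2 loss are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class NFDLoss(nn.Module):
    """Sketch of a normalized feature distillation loss.

    Assumption: teacher and student feature maps are standardized before an
    L2 imitation loss, so the student matches the pattern of the teacher's
    response rather than its magnitude. The paper's exact normalization and
    loss weighting may differ.
    """

    def __init__(self, student_channels: int, teacher_channels: int):
        super().__init__()
        # 1x1 conv to project student features to the teacher's channel dimension.
        self.align = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    @staticmethod
    def _normalize(feat: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
        # Standardize each sample's feature map: zero mean, unit variance.
        mean = feat.mean(dim=(1, 2, 3), keepdim=True)
        std = feat.std(dim=(1, 2, 3), keepdim=True)
        return (feat - mean) / (std + eps)

    def forward(self, feat_s: torch.Tensor, feat_t: torch.Tensor) -> torch.Tensor:
        feat_s = self.align(feat_s)
        # Match spatial resolution if the student's stride differs from the teacher's.
        if feat_s.shape[2:] != feat_t.shape[2:]:
            feat_s = F.interpolate(feat_s, size=feat_t.shape[2:],
                                   mode="bilinear", align_corners=False)
        return F.mse_loss(self._normalize(feat_s),
                          self._normalize(feat_t.detach()))
```

In training, such a term would typically be added to the standard segmentation cross-entropy loss with a balancing weight; the weight value and the choice of feature layers to distill are hyperparameters not specified here.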

Related research

05/07/2022
Distilling Inter-Class Distance for Semantic Segmentation
Knowledge distillation is widely adopted in semantic segmentation to red...

03/23/2023
A Simple and Generic Framework for Feature Distillation via Channel-wise Transformation
Knowledge distillation is a popular technique for transferring the knowl...

10/31/2019
Distilling Pixel-Wise Feature Similarities for Semantic Segmentation
Among the neural network compression techniques, knowledge distillation ...

05/06/2023
Structural and Statistical Texture Knowledge Distillation for Semantic Segmentation
Existing knowledge distillation works for semantic segmentation mainly f...

09/07/2023
Towards Comparable Knowledge Distillation in Semantic Image Segmentation
Knowledge Distillation (KD) is one proposed solution to large model size...

04/03/2019
A Comprehensive Overhaul of Feature Distillation
We investigate the design aspects of feature distillation methods achiev...

12/23/2019
Data-Free Adversarial Distillation
Knowledge Distillation (KD) has made remarkable progress in the last few...
