Distilling Inter-Class Distance for Semantic Segmentation

05/07/2022
by   Zhengbo Zhang, et al.

Knowledge distillation is widely adopted in semantic segmentation to reduce the computation cost. Previous knowledge distillation methods for semantic segmentation focus on pixel-wise feature alignment and intra-class feature variation distillation, neglecting to transfer the knowledge of the inter-class distance in the feature space, which is important for semantic segmentation. To address this issue, we propose an Inter-class Distance Distillation (IDD) method to transfer the inter-class distance in the feature space from the teacher network to the student network. Furthermore, since semantic segmentation is a position-dependent task, we exploit a position information distillation module to help the student network encode more position information. Extensive experiments on three popular datasets, Cityscapes, Pascal VOC and ADE20K, show that our method improves the accuracy of semantic segmentation models and achieves state-of-the-art performance; e.g., it boosts the benchmark model ("PSPNet+ResNet18") by 7.50% on the Cityscapes dataset.
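The core idea of inter-class distance distillation can be illustrated with a short sketch: compute a per-class centroid (prototype) from pixel embeddings, form the pairwise distance matrix between class centroids, and penalize the mismatch between the teacher's and student's matrices. This is a minimal NumPy illustration of the general idea, not the authors' exact formulation; the function names (`class_centroids`, `idd_loss`) and the MSE matching objective are assumptions for the example.

```python
import numpy as np

def class_centroids(features, labels, num_classes):
    """Mean embedding per class. features: (N, C) pixel embeddings; labels: (N,) ids."""
    cents = np.zeros((num_classes, features.shape[1]))
    for k in range(num_classes):
        mask = labels == k
        if mask.any():
            cents[k] = features[mask].mean(axis=0)
    return cents

def inter_class_distance(cents):
    """Pairwise Euclidean distances between class centroids, shape (K, K)."""
    diff = cents[:, None, :] - cents[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

def idd_loss(teacher_feat, student_feat, labels, num_classes):
    """Match the student's inter-class distance matrix to the teacher's (MSE)."""
    d_t = inter_class_distance(class_centroids(teacher_feat, labels, num_classes))
    d_s = inter_class_distance(class_centroids(student_feat, labels, num_classes))
    return ((d_t - d_s) ** 2).mean()
```

In a real segmentation setup the (N, C) embeddings would come from flattening the teacher's and student's feature maps, with labels downsampled to the feature resolution; the loss above would then be added to the usual cross-entropy and other distillation terms.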

Related research

- Distilling Pixel-Wise Feature Similarities for Semantic Segmentation (10/31/2019)
- Structured Knowledge Distillation for Semantic Segmentation (03/11/2019)
- Normalized Feature Distillation for Semantic Segmentation (07/12/2022)
- Knowledge Adaptation for Efficient Semantic Segmentation (03/12/2019)
- Dynamically Pruning SegFormer for Efficient Semantic Segmentation (11/18/2021)
- Anomaly Discovery in Semantic Segmentation via Distillation Comparison Networks (12/18/2021)
- A Comprehensive Overhaul of Feature Distillation (04/03/2019)
