AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation

08/08/2023
by   Amir M. Mansourian, et al.

In recent years, deep neural networks have achieved remarkable accuracy in computer vision tasks. With inference time being a crucial factor, particularly in dense prediction tasks such as semantic segmentation, knowledge distillation has emerged as a successful technique for improving the accuracy of lightweight student networks. Existing methods, however, often neglect the information carried in the channels and among different classes. To overcome these limitations, this paper proposes a novel knowledge distillation method called Inter-Class Similarity Distillation (ICSD). The proposed method transfers high-order relations from the teacher network to the student network by independently computing an intra-class distribution for each class from the network outputs. Inter-class similarity matrices are then calculated for distillation using the KL divergence between the distributions of each pair of classes. To further improve the effectiveness of the proposed method, an Adaptive Loss Weighting (ALW) training strategy is introduced. Unlike existing methods, the ALW strategy gradually reduces the influence of the teacher network towards the end of training to account for errors in the teacher's predictions. Extensive experiments on two well-known semantic segmentation datasets, Cityscapes and Pascal VOC 2012, validate the effectiveness of the proposed method in terms of mIoU and pixel accuracy. The proposed method outperforms most existing knowledge distillation methods in both quantitative and qualitative evaluations. Code is available at: https://github.com/AmirMansurian/AICSD
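The abstract leaves the exact formulation to the paper, so the following is a minimal PyTorch sketch of the idea rather than the authors' implementation. It assumes the intra-class distributions are softmaxes over the spatial positions of each class's output channel, that the distillation loss is the mean squared error between teacher and student similarity matrices, and that ALW follows a simple polynomial decay; the function names and the `w_max`/`power` parameters are hypothetical.

```python
import torch
import torch.nn.functional as F

def inter_class_similarity(logits: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Build a per-image C x C inter-class similarity matrix from logits.

    Each class channel is flattened over spatial positions and normalized
    with a softmax into a distribution over pixels; entry (i, j) is then
    KL(p_i || p_j) between the distributions of classes i and j.
    """
    b, c, h, w = logits.shape
    dist = F.softmax(logits.view(b, c, h * w), dim=-1)   # (B, C, N)
    log_dist = torch.log(dist + eps)
    # Broadcast to (B, C, C, N) and reduce over pixels:
    # kl[b, i, j] = sum_k p_i[k] * (log p_i[k] - log p_j[k])
    kl = (dist.unsqueeze(2) * (log_dist.unsqueeze(2) - log_dist.unsqueeze(1))).sum(-1)
    return kl                                            # (B, C, C)

def icsd_loss(student_logits: torch.Tensor, teacher_logits: torch.Tensor) -> torch.Tensor:
    """Match the student's inter-class similarity matrix to the teacher's."""
    s = inter_class_similarity(student_logits)
    with torch.no_grad():
        t = inter_class_similarity(teacher_logits)
    return F.mse_loss(s, t)

def alw_weight(epoch: int, total_epochs: int, w_max: float = 1.0, power: float = 2.0) -> float:
    """Adaptive Loss Weighting (assumed polynomial decay): shrink the
    distillation weight so the imperfect teacher matters less late in training."""
    return w_max * (1.0 - epoch / total_epochs) ** power
```

A training step would then combine the usual supervised term with the decayed distillation term, e.g. `loss = ce_loss + alw_weight(epoch, num_epochs) * icsd_loss(student_out, teacher_out)`, so that the teacher's influence vanishes by the final epoch.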


Related research

04/14/2022
Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Current Knowledge Distillation (KD) methods for semantic segmentation of...

05/21/2022
Knowledge Distillation from A Stronger Teacher
Unlike existing knowledge distillation methods focus on the baseline set...

04/11/2020
Inter-Region Affinity Distillation for Road Marking Segmentation
We study the problem of distilling knowledge from a large deep teacher n...

03/12/2019
Knowledge Adaptation for Efficient Semantic Segmentation
Both accuracy and efficiency are of significant importance to the task o...

10/14/2022
Lightweight Alpha Matting Network Using Distillation-Based Channel Pruning
Recently, alpha matting has received a lot of attention because of its u...

02/08/2022
Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
Knowledge Distillation has shown very promising ability in transferring...

12/18/2021
Anomaly Discovery in Semantic Segmentation via Distillation Comparison Networks
This paper aims to address the problem of anomaly discovery in semantic ...
