Cross-Image Relational Knowledge Distillation for Semantic Segmentation

04/14/2022
by Chuanguang Yang, et al.

Current Knowledge Distillation (KD) methods for semantic segmentation often guide the student to mimic the teacher's structured information generated from individual data samples, but they ignore the global semantic relations among pixels across different images, which are valuable for KD. This paper proposes a novel Cross-Image Relational KD (CIRKD), which focuses on transferring structured pixel-to-pixel and pixel-to-region relations across whole images. The motivation is that a good teacher network constructs a well-structured feature space in terms of global pixel dependencies. CIRKD makes the student mimic better-structured semantic relations from the teacher, thereby improving segmentation performance. Experimental results on the Cityscapes, CamVid, and Pascal VOC datasets demonstrate the effectiveness of the proposed approach against state-of-the-art distillation methods. The code is available at https://github.com/winycg/CIRKD.
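The pixel-to-pixel relation transfer described above can be sketched as follows. This is a minimal NumPy illustration of the general idea only (pool all pixel embeddings from a mini-batch so relations span images, then make the student's pairwise-similarity distribution match the teacher's via a KL divergence); the function name, temperature parameter, and exact loss form are assumptions, the pixel-to-region branch is omitted, and the authors' official PyTorch implementation lives in the linked repository.

```python
import numpy as np

def cross_image_pixel_relation_loss(feat_s, feat_t, temperature=1.0):
    """Hedged sketch of cross-image pixel-to-pixel relational distillation.

    feat_s, feat_t: student/teacher feature maps of shape (B, C, H, W).
    Returns a scalar KL-divergence loss between the teacher's and the
    student's pairwise pixel-similarity distributions.
    """
    B, C, H, W = feat_s.shape
    # Pool every pixel embedding from every image in the batch: (B*H*W, C).
    # Because pixels from different images sit in one matrix, the similarity
    # structure below is cross-image, not per-sample.
    s = feat_s.transpose(0, 2, 3, 1).reshape(-1, C)
    t = feat_t.transpose(0, 2, 3, 1).reshape(-1, C)
    # L2-normalize so dot products become cosine similarities.
    s = s / (np.linalg.norm(s, axis=1, keepdims=True) + 1e-8)
    t = t / (np.linalg.norm(t, axis=1, keepdims=True) + 1e-8)
    # Pairwise similarity matrices over all pixels: (N, N), N = B*H*W.
    sim_s = s @ s.T / temperature
    sim_t = t @ t.T / temperature

    def softmax(x):
        e = np.exp(x - x.max(axis=1, keepdims=True))
        return e / e.sum(axis=1, keepdims=True)

    # Student mimics the teacher's relational distribution row by row.
    p_t = softmax(sim_t)
    p_s = softmax(sim_s)
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=1)
    return float(np.mean(kl))
```

In practice N = B*H*W is large, so implementations typically subsample pixels or use a memory bank of pixel/region embeddings before forming the similarity matrices.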

Related research:

- Refine Myself by Teaching Myself: Feature Refinement via Self-Knowledge Distillation (03/15/2021). Knowledge distillation is a method of transferring the knowledge from a ...
- AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation (08/08/2023). In recent years, deep neural networks have achieved remarkable accuracy ...
- Distilling Pixel-Wise Feature Similarities for Semantic Segmentation (10/31/2019). Among the neural network compression techniques, knowledge distillation ...
- Anomaly Discovery in Semantic Segmentation via Distillation Comparison Networks (12/18/2021). This paper aims to address the problem of anomaly discovery in semantic ...
- Knowledge Distillation from A Stronger Teacher (05/21/2022). Unlike existing knowledge distillation methods focus on the baseline set...
- Distilling Holistic Knowledge with Graph Neural Networks (08/12/2021). Knowledge Distillation (KD) aims at transferring knowledge from a larger...
- LaTeS: Latent Space Distillation for Teacher-Student Driving Policy Learning (12/06/2019). We describe a policy learning approach to map visual inputs to driving c...
