Inter-Region Affinity Distillation for Road Marking Segmentation

04/11/2020
by Yuenan Hou, et al.

We study the problem of distilling knowledge from a large, deep teacher network to a much smaller student network for the task of road marking segmentation. In this work, we explore a novel knowledge distillation (KD) approach that transfers 'knowledge' about scene structure more effectively from a teacher to a student model. Our method, Inter-Region Affinity KD (IntRA-KD), decomposes a given road scene image into different regions and represents each region as a node in a graph. An inter-region affinity graph is then formed by establishing pairwise relationships between nodes based on their similarity in feature distribution. To learn structural knowledge from the teacher network, the student is required to match the graph generated by the teacher. The proposed method shows promising results on three large-scale road marking segmentation benchmarks, i.e., ApolloScape, CULane and LLAMAS, with various lightweight models as students and ResNet-101 as the teacher. Compared to previous distillation methods, IntRA-KD consistently brings higher performance gains to all lightweight models. Our code is available at https://github.com/cardwing/Codes-for-IntRA-KD.
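To make the idea concrete, below is a minimal PyTorch sketch of an inter-region affinity distillation loss. It is an illustration under stated assumptions, not the authors' exact implementation: regions are assumed to be given as binary masks (e.g., one per road-marking class), each graph node is the masked mean of the feature map, pairwise affinity is cosine similarity, and the student matches the teacher's graph with an MSE loss. The paper's actual region decomposition and feature-distribution statistics may differ, and all function names here are hypothetical.

import torch
import torch.nn.functional as F

def region_nodes(feats, masks):
    # feats: (B, C, H, W) feature map; masks: (B, R, H, W) binary region masks.
    # Each region becomes one graph node: the masked average of the feature map.
    masks = masks.float()
    area = masks.sum(dim=(2, 3)).clamp(min=1e-6)            # (B, R); avoids empty regions
    pooled = torch.einsum('bchw,brhw->brc', feats, masks)   # (B, R, C)
    return pooled / area.unsqueeze(-1)

def affinity_graph(nodes):
    # Pairwise cosine similarity between nodes -> (B, R, R) affinity matrix.
    nodes = F.normalize(nodes, dim=-1)
    return torch.bmm(nodes, nodes.transpose(1, 2))

def inter_region_affinity_kd_loss(student_feats, teacher_feats, masks):
    # The student is trained to reproduce the teacher's inter-region affinity graph.
    a_student = affinity_graph(region_nodes(student_feats, masks))
    with torch.no_grad():  # the teacher stays frozen during distillation
        a_teacher = affinity_graph(region_nodes(teacher_feats, masks))
    return F.mse_loss(a_student, a_teacher)

# Toy usage with made-up shapes: teacher and student channel widths may differ.
student_feats = torch.randn(2, 64, 24, 24)
teacher_feats = torch.randn(2, 256, 24, 24)
masks = (torch.rand(2, 5, 24, 24) > 0.5)
loss = inter_region_affinity_kd_loss(student_feats, teacher_feats, masks)

One reason this formulation suits cross-architecture distillation is that the loss is computed on R x R affinity graphs rather than on raw features, so a lightweight student needs no channel-alignment layer to match a ResNet-101 teacher.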


Related research:

07/12/2022  Knowledge Condensation Distillation
08/08/2023  AICSD: Adaptive Inter-Class Similarity Distillation for Semantic Segmentation
05/21/2022  Knowledge Distillation from A Stronger Teacher
09/03/2020  Intra-Utterance Similarity Preserving Knowledge Distillation for Audio Tagging
06/05/2022  Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation
02/08/2022  Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
09/06/2023  Knowledge Distillation Layer that Lets the Student Decide
