Dual Relation Knowledge Distillation for Object Detection

02/11/2023
by Zhenliang Ni, et al.

Knowledge distillation is an effective method for model compression. However, applying knowledge distillation to detection tasks remains challenging. Two key issues lead to poor distillation performance for detection: the serious imbalance between foreground and background features, and the lack of sufficient feature representation for small objects. To address these issues, we propose a new distillation method named dual relation knowledge distillation (DRKD), which includes pixel-wise relation distillation and instance-wise relation distillation. The pixel-wise relation distillation embeds pixel-wise features in a graph space and applies graph convolution to capture the global pixel relation. By distilling the global pixel relation, the student detector can learn the relation between foreground and background features, avoiding the difficulty of distilling features directly under the feature-imbalance issue. Besides, we find that instance-wise relations supplement valuable knowledge beyond independent features for small objects. Thus, the instance-wise relation distillation is designed, which calculates the similarity between different instances to obtain a relation matrix. More importantly, a relation filter module is designed to highlight valuable instance relations. The proposed dual relation knowledge distillation is general and can easily be applied to both one-stage and two-stage detectors. Our method achieves state-of-the-art performance, improving Faster R-CNN based on ResNet50 from 38.4% to 41.6% mAP and improving RetinaNet based on ResNet50 from its 37.4% mAP baseline.
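The core idea in the abstract is that the student mimics the teacher's relation matrices rather than its raw feature maps. The sketch below illustrates this in PyTorch. It is a minimal sketch, not the authors' implementation: the function names, the use of cosine similarity, and the loss weights are assumptions, and the paper's graph-convolution embedding and relation filter module are omitted.

```python
import torch
import torch.nn.functional as F

def pixel_relation(feat):
    """Pairwise pixel relation matrix from a feature map.

    feat: (B, C, H, W) backbone/FPN feature map.
    Returns a (B, H*W, H*W) cosine-similarity matrix, a simplified
    stand-in for the graph-space pixel relation described in the paper.
    """
    b, c, h, w = feat.shape
    x = feat.flatten(2).transpose(1, 2)      # (B, HW, C)
    x = F.normalize(x, dim=-1)               # unit-norm per pixel
    return torch.bmm(x, x.transpose(1, 2))   # (B, HW, HW)

def instance_relation(inst_feats):
    """Pairwise similarity between pooled instance features.

    inst_feats: (N, D) per-instance (e.g. RoI-pooled) features.
    Returns an (N, N) relation matrix.
    """
    x = F.normalize(inst_feats, dim=-1)
    return x @ x.t()

def dual_relation_kd_loss(s_feat, t_feat, s_inst, t_inst,
                          w_pixel=1.0, w_inst=1.0):
    """Combine pixel-wise and instance-wise relation distillation.

    The student matches the teacher's relation matrices instead of its
    raw features. The weights w_pixel and w_inst are hypothetical
    hyper-parameters, not values from the paper.
    """
    loss_pixel = F.mse_loss(pixel_relation(s_feat), pixel_relation(t_feat))
    loss_inst = F.mse_loss(instance_relation(s_inst), instance_relation(t_inst))
    return w_pixel * loss_pixel + w_inst * loss_inst
```

Matching relation matrices rather than raw feature maps is what lets the student pick up the structure between foreground and background without the loss being dominated by the far more numerous background pixels, and the instance-wise term adds cross-instance context that small objects lack on their own.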

Related research

03/03/2021 · General Instance Distillation for Object Detection
In recent years, knowledge distillation has been proved to be an effecti...

07/20/2020 · Interpretable Foreground Object Search As Knowledge Distillation
This paper proposes a knowledge distillation method for foreground objec...

06/21/2021 · Structured Sparse R-CNN for Direct Scene Graph Generation
Scene graph generation (SGG) is to detect entity pairs with their relati...

03/11/2019 · Structured Knowledge Distillation for Semantic Segmentation
In this paper, we investigate the knowledge distillation strategy for tr...

09/06/2023 · DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation
Recent mainstream masked distillation methods function by reconstructing...

09/27/2021 · Deep Structured Instance Graph for Distilling Object Detectors
Effectively structuring deep knowledge plays a pivotal role in transfer ...

11/23/2022 · Structural Knowledge Distillation for Object Detection
Knowledge Distillation (KD) is a well-known training paradigm in deep ne...
