Gradient-Guided Knowledge Distillation for Object Detectors

03/07/2023
by Qizhen Lan, et al.

Deep learning models have demonstrated remarkable success in object detection, yet their complexity and computational intensity pose a barrier to deploying them in real-world applications (e.g., self-driving perception). Knowledge Distillation (KD) is an effective way to derive efficient models. However, only a small number of KD methods tackle object detection, and most of them focus on mimicking the plain features of the teacher model while rarely considering how those features contribute to the final detection. In this paper, we propose a novel approach for knowledge distillation in object detection, named Gradient-guided Knowledge Distillation (GKD). Our GKD uses gradient information to identify and assign larger weights to the features that significantly impact the detection loss, allowing the student to learn the most relevant features from the teacher. Furthermore, we present bounding-box-aware multi-grained feature imitation (BMFI) to further improve the KD performance. Experiments on the KITTI and COCO-Traffic datasets demonstrate our method's efficacy in knowledge distillation for object detection. On one-stage and two-stage detectors, our GKD-BMFI leads to an average of 5.1 mAP improvement, outperforming various state-of-the-art KD methods.
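Below is a minimal PyTorch sketch of the gradient-guided weighting idea described in the abstract: the gradient of the teacher's detection loss with respect to the teacher's feature map marks which locations matter most for detection, and those locations receive larger weights in a feature-imitation loss. This is an illustrative assumption of how such a loss could look, not the authors' implementation; the function and variable names (gradient_guided_imitation_loss, t_feat, s_feat) are hypothetical, and the toy detection loss at the end only stands in for a real detector head.

```python
import torch

def gradient_guided_imitation_loss(t_feat, s_feat, teacher_det_loss, eps=1e-6):
    """Weight the feature-imitation error by how strongly each spatial location
    of the teacher feature map influences the teacher's detection loss.

    t_feat: teacher feature map [B, C, H, W] with requires_grad=True
    s_feat: student feature map [B, C, H, W], already projected to C channels
    teacher_det_loss: scalar detection loss computed from t_feat
    """
    # Gradient of the detection loss w.r.t. the teacher features:
    # large magnitudes indicate locations that drive the detection result.
    grad = torch.autograd.grad(teacher_det_loss, t_feat, retain_graph=True)[0]

    # Collapse channels and normalize to a per-location weight map in [0, 1].
    weight = grad.abs().mean(dim=1, keepdim=True)                  # [B, 1, H, W]
    weight = weight / (weight.amax(dim=(2, 3), keepdim=True) + eps)

    # Weighted mean-squared imitation error against the detached teacher features.
    return (weight * (s_feat - t_feat.detach()).pow(2)).mean()

# Toy usage: random tensors stand in for real detector features and loss.
t_feat = torch.randn(2, 256, 32, 32, requires_grad=True)
s_feat = torch.randn(2, 256, 32, 32, requires_grad=True)
det_loss = (t_feat * torch.randn_like(t_feat)).sum()  # placeholder for a real detection loss
kd_loss = gradient_guided_imitation_loss(t_feat, s_feat, det_loss)
kd_loss.backward()
```

In practice such an imitation term would be added to the student's own detection loss with a weighting coefficient; the bounding-box-aware multi-grained feature imitation (BMFI) described in the abstract would further restrict and reweight the regions being imitated.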


research
01/26/2022

Adaptive Instance Distillation for Object Detection in Autonomous Driving

In recent years, knowledge distillation (KD) has been widely used as an ...
research
03/10/2022

Prediction-Guided Distillation for Dense Object Detection

Real-world object detection models should be cheap and accurate. Knowled...
research
06/09/2019

Distilling Object Detectors with Fine-grained Feature Imitation

State-of-the-art CNN based recognition models are often computationally ...
research
08/03/2022

KD-SCFNet: Towards More Accurate and Efficient Salient Object Detection via Knowledge Distillation

Most existing salient object detection (SOD) models are difficult to app...
research
06/17/2021

Dynamic Knowledge Distillation with A Single Stream Structure for RGB-D Salient Object Detection

RGB-D salient object detection(SOD) demonstrates its superiority on dete...
research
03/26/2021

Hands-on Guidance for Distilling Object Detectors

Knowledge distillation can lead to deploy-friendly networks against the ...
research
01/31/2023

AMD: Adaptive Masked Distillation for Object Detection

As a general model compression paradigm, feature-based knowledge distill...
