Prediction-Guided Distillation for Dense Object Detection

03/10/2022
by Chenhongyi Yang, et al.

Real-world object detection models should be cheap and accurate. Knowledge distillation (KD) can boost the accuracy of a small, cheap detection model by leveraging useful information from a larger teacher model. However, a key challenge is identifying the most informative features produced by the teacher for distillation. In this work, we show that only a very small fraction of features within a ground-truth bounding box are responsible for a teacher's high detection performance. Based on this, we propose Prediction-Guided Distillation (PGD), which focuses distillation on these key predictive regions of the teacher and yields considerable gains in performance over many existing KD baselines. In addition, we propose an adaptive weighting scheme over the key regions to smooth out their influence and achieve even better performance. Our proposed approach outperforms current state-of-the-art KD baselines on a variety of advanced one-stage detection architectures. Specifically, on the COCO dataset, our method achieves between +3.1% and +4.6% AP improvement using ResNet-101 and ResNet-50 as the teacher and student backbones, respectively. On the CrowdHuman dataset, we achieve +3.2% AP and +2.0% MR improvements, also using these backbones. Our code is available at https://github.com/ChenhongyiYang/PGD.
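To make the core idea concrete, below is a minimal sketch of a prediction-guided feature distillation loss: per-location teacher prediction quality selects a few key regions, and a softmax over those quality scores provides adaptive weights. This is an illustration under stated assumptions, not the paper's implementation (see the linked repository for that): the function name pgd_distill_loss, the precomputed quality input, and the specific top-k/softmax weighting are hypothetical, and teacher and student features are assumed to share the same shape (a real setup may need an adaptation layer).

```python
import torch
import torch.nn.functional as F

def pgd_distill_loss(feat_s, feat_t, quality, k=10, temperature=1.0):
    """Sketch of a prediction-guided feature distillation loss.

    feat_s, feat_t: student / teacher feature maps, shape (N, C, H, W),
        assumed to have matching shapes.
    quality: per-location teacher prediction quality, shape (N, H, W),
        e.g. classification score times predicted-box IoU, assumed to be
        zeroed outside ground-truth boxes.
    k: number of top-scoring locations distilled per image.
    """
    n, c, h, w = feat_s.shape
    q = quality.flatten(1)                             # (N, H*W)
    topk_val, topk_idx = q.topk(k, dim=1)              # key predictive regions
    # Adaptive weighting: softmax over the top-k quality scores, so stronger
    # teacher predictions contribute more and weaker ones are smoothed down.
    weight = F.softmax(topk_val / temperature, dim=1)  # (N, k)

    fs = feat_s.flatten(2).transpose(1, 2)             # (N, H*W, C)
    ft = feat_t.flatten(2).transpose(1, 2)
    idx = topk_idx.unsqueeze(-1).expand(-1, -1, c)     # (N, k, C)
    fs_k = fs.gather(1, idx)                           # student features at key regions
    ft_k = ft.gather(1, idx)                           # teacher features at key regions

    # Per-location feature MSE, weighted by the adaptive region weights.
    per_loc = F.mse_loss(fs_k, ft_k, reduction="none").mean(-1)  # (N, k)
    return (weight * per_loc).sum(1).mean()

# Toy usage with random tensors standing in for one FPN level.
feat_t = torch.randn(2, 256, 32, 32)
feat_s = torch.randn(2, 256, 32, 32)
quality = torch.rand(2, 32, 32)
loss = pgd_distill_loss(feat_s, feat_t, quality)
```

In practice such a loss would be computed per FPN level and added, with a balancing coefficient, to the student's ordinary detection losses.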

