General Instance Distillation for Object Detection

03/03/2021
by Xing Dai, et al.

In recent years, knowledge distillation has proven to be an effective solution for model compression: it lets lightweight student models acquire the knowledge extracted from cumbersome teacher models. However, previous distillation methods for detection generalize weakly across different detection frameworks and rely heavily on ground truth (GT), ignoring the valuable relation information between instances. Thus, we propose a novel distillation method for detection tasks based on discriminative instances, selected without regard to the positive/negative labels assigned by GT, which we call general instance distillation (GID). Our approach contains a general instance selection module (GISM) that makes full use of feature-based, relation-based and response-based knowledge for distillation. Extensive results demonstrate that the student model achieves significant AP improvement and even outperforms the teacher in various detection frameworks. Specifically, RetinaNet with ResNet-50 achieves 39.1% AP with GID, surpassing the 36.2% baseline and even the teacher model at 38.1%.
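The response-based component of distillation objectives like this one is typically a temperature-scaled KL divergence between the teacher's and student's class scores. A minimal NumPy sketch of that generic loss (function names and the temperature value are illustrative, not taken from the GID code):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the class dimension.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0):
    """Temperature-scaled KL(teacher || student), the classic
    response-based knowledge-distillation loss (Hinton et al.).
    Scaling by T^2 keeps gradient magnitudes comparable across T."""
    p = softmax(np.asarray(teacher_logits, dtype=float) / T)
    q = softmax(np.asarray(student_logits, dtype=float) / T)
    kl = (p * (np.log(p) - np.log(q))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

When student and teacher logits agree the loss is zero; the softer the temperature, the more the teacher's inter-class similarity structure (the "dark knowledge") is emphasized. GID applies such objectives only on its selected general instances rather than on all anchors.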


Related research

- Adaptive Instance Distillation for Object Detection in Autonomous Driving (01/26/2022): In recent years, knowledge distillation (KD) has been widely used as an ...
- Dual Relation Knowledge Distillation for Object Detection (02/11/2023): Knowledge distillation is an effective method for model compression. How...
- Instance-Conditional Knowledge Distillation for Object Detection (10/25/2021): Despite the success of Knowledge Distillation (KD) on image classificati...
- Task-Balanced Distillation for Object Detection (08/05/2022): Mainstream object detectors are commonly constituted of two sub-tasks, i...
- CORSD: Class-Oriented Relational Self Distillation (04/28/2023): Knowledge distillation conducts an effective model compression method wh...
- Correlation Congruence for Knowledge Distillation (04/03/2019): Most teacher-student frameworks based on knowledge distillation (KD) dep...
- Prediction-Guided Distillation for Dense Object Detection (03/10/2022): Real-world object detection models should be cheap and accurate. Knowled...
