Instance-Conditional Knowledge Distillation for Object Detection

10/25/2021
by Zijian Kang, et al.

Despite the success of Knowledge Distillation (KD) on image classification, applying KD to object detection remains challenging because of the difficulty of locating knowledge. In this paper, we propose an instance-conditional distillation framework to find the desired knowledge. To locate the knowledge for each instance, we use observed instances as condition information and formulate the retrieval as an instance-conditional decoding process. Specifically, the information of each instance that specifies a condition is encoded as a query, and the teacher's information is presented as keys; the correlation between them is measured by query-key attention, implemented with a transformer decoder. To guide this module, we further introduce an auxiliary task directed at instance localization and identification, which are fundamental to detection. Extensive experiments demonstrate the efficacy of our method: we observe impressive improvements under various settings. Notably, we boost RetinaNet with a ResNet-50 backbone from 37.4 to 40.7 mAP (+3.3) under the 1x schedule, which even surpasses the teacher (40.4 mAP) with a ResNet-101 backbone under the 3x schedule. Code will be released soon.
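The retrieval mechanism described in the abstract can be sketched roughly as follows. This is a minimal, illustrative PyTorch sketch of the general idea (instance descriptors encoded as queries, teacher features as keys, query-key attention used to locate and weight the knowledge the student imitates); it is not the authors' implementation, and every module name, shape, and input format here (the 5-d instance descriptor, a single flattened feature map, the mask-weighted imitation loss) is an assumption made for illustration.

```python
import torch
import torch.nn as nn


class InstanceConditionalDistiller(nn.Module):
    """Toy sketch: instance queries attend over teacher features to locate
    the regions whose features the student is asked to imitate."""

    def __init__(self, feat_dim=256, num_heads=8):
        super().__init__()
        # Encode each annotated instance (here: a 4-d box plus a class index,
        # i.e. 5 numbers) into a condition query. This input format is purely
        # an illustrative assumption.
        self.query_encoder = nn.Linear(5, feat_dim)
        self.cross_attn = nn.MultiheadAttention(feat_dim, num_heads,
                                                batch_first=True)

    def forward(self, instances, teacher_feat, student_feat):
        # instances:    (B, N, 5)   per-image instance descriptors (assumed)
        # teacher_feat: (B, HW, C)  flattened teacher feature map (keys/values)
        # student_feat: (B, HW, C)  flattened student feature map
        queries = self.query_encoder(instances)                    # (B, N, C)
        # Query-key attention measures how relevant each teacher location
        # is to each instance.
        _, attn = self.cross_attn(queries, teacher_feat, teacher_feat,
                                  need_weights=True)               # (B, N, HW)
        # Collapse the per-instance attention maps into one spatial importance
        # mask and use it to weight a simple feature-imitation loss.
        mask = attn.sum(dim=1).unsqueeze(-1)                       # (B, HW, 1)
        diff = (student_feat - teacher_feat).pow(2).sum(-1, keepdim=True)
        return (mask * diff).sum() / mask.sum().clamp(min=1e-6)


if __name__ == "__main__":
    distiller = InstanceConditionalDistiller()
    loss = distiller(torch.rand(2, 7, 5),      # 7 instances per image
                     torch.rand(2, 400, 256),  # teacher features, 20x20 map
                     torch.rand(2, 400, 256))  # student features
    print(loss.item())
```

In practice such a module would be trained alongside the detector's usual losses, with gradients flowing only into the student; the authors' released code remains the authoritative reference for the actual loss design and the auxiliary localization and identification task.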

