Label Assignment Distillation for Object Detection

09/16/2021
by   Minghao Gao, et al.

Knowledge distillation has proven promising for improving the performance of neural networks, at no additional computational cost at inference time. To boost the accuracy of object detection, many knowledge distillation methods have been proposed specifically for detectors. However, most of these methods focus only on feature-level and label-level distillation, leaving aside label assignment, a unique and crucial step in object detection. In this work, we propose a simple but effective knowledge distillation approach focusing on label assignment, in which the positive and negative samples of the student network are selected according to the predictions of the teacher network. Our method shows encouraging results on the MS COCO 2017 benchmark; it applies to both one-stage and two-stage detectors and can be used orthogonally with other knowledge distillation methods.
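The core idea, selecting the student's positive and negative training samples from the teacher's predictions rather than from a fixed IoU rule, can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm; the function name and the `top_k` and `iou_floor` parameters are assumptions for the sketch:

```python
def assign_labels_from_teacher(teacher_scores, iou_with_gt, top_k=2, iou_floor=0.1):
    """Teacher-guided label assignment (hypothetical sketch).

    teacher_scores: per-anchor confidence from the teacher for the target class.
    iou_with_gt:    per-anchor IoU with one ground-truth box.

    Anchors are ranked by the teacher's confidence instead of IoU alone;
    the top_k candidates that still overlap the ground truth become the
    student's positives, everything else is a negative.
    Returns a list of 1 (positive) / 0 (negative) labels.
    """
    # Keep only anchors with some overlap with the ground-truth box.
    candidates = [i for i, iou in enumerate(iou_with_gt) if iou > iou_floor]
    # Rank candidate anchors by the teacher's predicted confidence.
    candidates.sort(key=lambda i: teacher_scores[i], reverse=True)
    positives = set(candidates[:top_k])
    return [1 if i in positives else 0 for i in range(len(teacher_scores))]


# Anchor 1 has higher IoU than anchor 2, but the teacher is more confident
# about anchor 2, so anchor 2 is chosen as a positive instead.
labels = assign_labels_from_teacher(
    teacher_scores=[0.9, 0.2, 0.7, 0.05],
    iou_with_gt=[0.6, 0.5, 0.4, 0.0],
    top_k=2,
)
print(labels)  # [1, 0, 1, 0]
```

A plain IoU-based assigner would have picked anchors 0 and 1 here; following the teacher's scores shifts the positive set, which is what lets the student inherit the teacher's notion of which samples are learnable.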

Related research

06/20/2019 — GAN-Knowledge Distillation for one-stage Object Detection
Convolutional neural networks have a significant improvement in the accu...

11/17/2022 — DETRDistill: A Universal Knowledge Distillation Framework for DETR-families
Transformer-based detectors (DETRs) have attracted great attention due t...

08/26/2022 — Disentangle and Remerge: Interventional Knowledge Distillation for Few-Shot Object Detection from A Conditional Causal Perspective
Few-shot learning models learn representations with limited human annota...

08/24/2021 — Improving Object Detection by Label Assignment Distillation
Label assignment in object detection aims to assign targets, foreground ...

03/04/2019 — TKD: Temporal Knowledge Distillation for Active Perception
Deep neural networks based methods have been proved to achieve outstandi...

05/30/2022 — Towards Efficient 3D Object Detection with Knowledge Distillation
Despite substantial progress in 3D object detection, advanced 3D detecto...

08/28/2023 — Bridging Cross-task Protocol Inconsistency for Distillation in Dense Object Detection
Knowledge distillation (KD) has shown potential for learning compact mod...
