Distilling Image Classifiers in Object Detectors

06/09/2021
by Shuxuan Guo, et al.

Knowledge distillation constitutes a simple yet effective way to improve the performance of a compact student network by exploiting the knowledge of a more powerful teacher. Nevertheless, the knowledge distillation literature remains limited to the scenario where the student and the teacher tackle the same task. Here, we investigate the problem of transferring knowledge not only across architectures but also across tasks. To this end, we study the case of object detection and, instead of following the standard detector-to-detector distillation approach, introduce a classifier-to-detector knowledge transfer framework. In particular, we propose strategies to exploit the classification teacher to improve both the detector's recognition accuracy and localization performance. Our experiments on several detectors with different backbones demonstrate the effectiveness of our approach, allowing us to outperform the state-of-the-art detector-to-detector distillation methods.
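To make the core idea concrete, below is a minimal sketch (in PyTorch) of one way a classification teacher could supervise a detector's per-region class predictions via soft-label distillation. This is an illustration under stated assumptions, not the authors' released implementation: `teacher` is assumed to be a pretrained image classifier, `crops` are image patches cut out around the detector's predicted boxes, and `student_logits` are the detector's classification logits for those same regions over a shared label set.

    # Minimal classifier-to-detector distillation sketch (illustrative only).
    # Assumptions: `teacher` maps image crops to class logits; `student_logits`
    # are the detector's per-region class logits over the same label set.
    import torch
    import torch.nn.functional as F

    def classification_kd_loss(student_logits: torch.Tensor,
                               teacher: torch.nn.Module,
                               crops: torch.Tensor,
                               temperature: float = 4.0) -> torch.Tensor:
        """Soft-label distillation loss (Hinton-style) between a
        classification teacher and a detector's region predictions."""
        with torch.no_grad():
            teacher_logits = teacher(crops)  # (N, num_classes)
        # Softened distributions; the T^2 factor restores gradient magnitude.
        t = F.softmax(teacher_logits / temperature, dim=-1)
        s = F.log_softmax(student_logits / temperature, dim=-1)
        return F.kl_div(s, t, reduction="batchmean") * temperature ** 2

In training, such a term would simply be added to the usual detection objective, e.g. total_loss = detection_loss + lambda * kd_loss for some weighting lambda. Note that the paper also transfers knowledge to improve localization, which this classification-only sketch does not cover.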

Related research

06/23/2020 · Distilling Object Detectors with Task Adaptive Regularization
Current state-of-the-art object detectors are at the expense of high com...

03/26/2021 · Distilling Object Detectors via Decoupled Features
Knowledge distillation is a widely used paradigm for inheriting informat...

09/20/2022 · Rethinking Data Augmentation in Knowledge Distillation for Object Detection
Knowledge distillation (KD) has shown its effectiveness for object detec...

09/20/2019 · Learning Lightweight Pedestrian Detector with Hierarchical Knowledge Distillation
It remains very challenging to build a pedestrian detection system for r...

03/27/2023 · UniDistill: A Universal Cross-Modality Knowledge Distillation Framework for 3D Object Detection in Bird's-Eye View
In the field of 3D object detection for autonomous driving, the sensor p...

09/23/2021 · LGD: Label-guided Self-distillation for Object Detection
In this paper, we propose the first self-distillation framework for gene...

05/30/2022 · Knowledge Distillation for 6D Pose Estimation by Keypoint Distribution Alignment
Knowledge distillation facilitates the training of a compact student net...
