Adaptive Instance Distillation for Object Detection in Autonomous Driving

01/26/2022
by Qizhen Lan, et al.

In recent years, knowledge distillation (KD) has been widely used as an effective way to derive efficient models. By imitating a large teacher model, a lightweight student model can achieve comparable performance with greater efficiency. However, most existing knowledge distillation methods focus on classification tasks; only a limited number of studies have applied knowledge distillation to object detection, especially in time-sensitive autonomous driving scenarios. We propose the Adaptive Instance Distillation (AID) method to selectively impart the teacher's knowledge to the student and thereby improve the performance of knowledge distillation. Unlike previous KD methods that treat all instances equally, our AID attentively adjusts the distillation weight of each instance based on the teacher model's prediction loss. We verified the effectiveness of our AID method through experiments on the KITTI and COCO traffic datasets. The results show that our method improves the performance of existing state-of-the-art attention-guided and non-local distillation methods and achieves better distillation results on both single-stage and two-stage detectors. Compared to the baseline, our AID led to an average of 2.7% and 2.1% mAP increases for single-stage and two-stage detectors, respectively. Furthermore, our AID is also shown to be useful for self-distillation, improving the teacher model's performance.
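To picture the core idea of instance-wise weighting, the sketch below shows one hypothetical way a per-instance distillation weight could be derived from the teacher's prediction loss, as the abstract describes. The function names, the softmax weighting scheme, and the temperature parameter are illustrative assumptions, not the paper's exact formulation.

    import torch

    def aid_instance_weights(teacher_losses: torch.Tensor,
                             temperature: float = 1.0) -> torch.Tensor:
        """Hypothetical per-instance distillation weights.

        Down-weights instances on which the teacher itself performs poorly
        (high prediction loss), so the student imitates the teacher mainly
        where the teacher is reliable. This weighting scheme is assumed
        for illustration; see the paper for the actual AID formulation.
        """
        # Softmax over negated teacher losses: low teacher loss -> high
        # weight. `temperature` controls how sharply the weights differ.
        return torch.softmax(-teacher_losses / temperature, dim=0)

    def aid_distill_loss(per_instance_kd_losses: torch.Tensor,
                         teacher_losses: torch.Tensor) -> torch.Tensor:
        # Weighted sum of per-instance distillation losses, in contrast
        # to the uniform average used by instance-agnostic KD methods.
        weights = aid_instance_weights(teacher_losses.detach())
        return (weights * per_instance_kd_losses).sum()

In this reading, the teacher's loss acts as a per-instance reliability signal that replaces the equal weighting of earlier KD methods; the same weighting can be reused for self-distillation, where teacher and student share the same architecture.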

Related research

03/07/2023 · Gradient-Guided Knowledge Distillation for Object Detectors
Deep learning models have demonstrated remarkable success in object dete...

03/03/2021 · General Instance Distillation for Object Detection
In recent years, knowledge distillation has been proved to be an effecti...

05/23/2022 · PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
The remarkable breakthroughs in point cloud representation learning have...

09/23/2021 · LGD: Label-guided Self-distillation for Object Detection
In this paper, we propose the first self-distillation framework for gene...

03/10/2022 · Prediction-Guided Distillation for Dense Object Detection
Real-world object detection models should be cheap and accurate. Knowled...

03/22/2022 · SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
Skin cancer is one of the most common types of malignancy, affecting a l...

08/04/2020 · Prime-Aware Adaptive Distillation
Knowledge distillation (KD) aims to improve the performance of a student ...
