Towards Efficient 3D Object Detection with Knowledge Distillation

05/30/2022
by   Jihan Yang, et al.

Despite substantial progress in 3D object detection, advanced 3D detectors often suffer from heavy computation overheads. To this end, we explore the potential of knowledge distillation (KD) for developing efficient 3D object detectors, focusing on popular pillar- and voxel-based detectors. In the absence of well-developed teacher-student pairs, we first study how to obtain student models with good trade-offs between accuracy and efficiency, from the perspectives of model compression and input resolution reduction. We then build a benchmark to assess existing KD methods developed in the 2D domain on 3D object detection, using six well-constructed teacher-student pairs. Further, we propose an improved KD pipeline that incorporates an enhanced logit KD method, which performs KD only at a few pivotal positions determined by the teacher's classification response, and a teacher-guided student model initialization, which transfers the teacher model's feature-extraction ability to the student through weight inheritance. Finally, we conduct extensive experiments on the Waymo dataset. Our best-performing model achieves 65.75% LEVEL 2 mAPH, surpassing its teacher model while requiring only 44% of the teacher's FLOPs. Our most efficient model runs at 51 FPS on an NVIDIA A100, 2.2× faster than PointPillar with even higher accuracy. Code will be available.
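To make the pivotal-position logit KD idea above concrete, here is a minimal NumPy sketch under our own assumptions: the function names (`softmax`, `pivotal_logit_kd_loss`) and the top-k selection by the teacher's peak class probability are hypothetical illustrations, and the paper's actual selection rule and loss may differ.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the class dimension.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def pivotal_logit_kd_loss(teacher_logits, student_logits, k):
    """Distill classification logits only at the k positions where the
    teacher's classification response is strongest (hypothetical sketch).

    teacher_logits, student_logits: (num_positions, num_classes) arrays.
    Returns the KL(teacher || student) averaged over the pivotal positions.
    """
    t_prob = softmax(teacher_logits)
    confidence = t_prob.max(axis=-1)       # teacher's peak response per position
    pivotal = np.argsort(confidence)[-k:]  # indices of the k most confident positions
    t = t_prob[pivotal]
    s = softmax(student_logits[pivotal])
    # KL divergence per pivotal position, averaged; small epsilon avoids log(0).
    return float(np.mean(np.sum(t * (np.log(t + 1e-9) - np.log(s + 1e-9)), axis=-1)))
```

Restricting the loss to high-response positions keeps the student from spending capacity mimicking the teacher's (mostly background) low-confidence outputs, which is the intuition the abstract points at.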

Related research:
- Fixing the Teacher-Student Knowledge Discrepancy in Distillation (03/31/2021)
- Label Assignment Distillation for Object Detection (09/16/2021)
- PKD: General Distillation Framework for Object Detectors via Pearson Correlation Coefficient (07/05/2022)
- LGD: Label-guided Self-distillation for Object Detection (09/23/2021)
- Representation Disparity-aware Distillation for 3D Object Detection (08/20/2023)
- Distilling Object Detectors With Global Knowledge (10/17/2022)
- Guided Hybrid Quantization for Object Detection in Multimodal Remote Sensing Imagery via One-to-one Self-teaching (12/31/2022)
