itKD: Interchange Transfer-based Knowledge Distillation for 3D Object Detection

05/31/2022
by Hyeon Cho, et al.

Recently, point-cloud-based 3D object detectors have achieved remarkable progress. However, most studies focus on developing deep-learning architectures that improve only accuracy. In this paper, we propose an autoencoder-style framework for knowledge distillation comprising channel-wise compression and decompression via interchange transfer. To teach the student the map-view feature of the teacher network, the features of both networks are independently passed through a shared autoencoder; here, a compressed-representation loss binds the channel-wise compression knowledge of the two networks as a form of regularization. The decompressed features are then transferred in opposite directions to reduce the gap between the interchange reconstructions. Lastly, we present an attentive head loss that matches the pivotal detection information drawn out by a multi-head self-attention mechanism. Through extensive experiments, we verify that our method trains a lightweight model well aligned with the 3D point-cloud detection task, and we demonstrate its superiority on the well-known public datasets Waymo and nuScenes.
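The abstract outlines three loss terms: a compressed-representation loss on the codes of a shared autoencoder, an interchange reconstruction loss in opposite directions, and an attentive head loss. Since the paper's exact layer shapes and loss weights are not given here, the following is a minimal PyTorch sketch of those three terms; the 1x1-convolution autoencoder, the L1 distances, the number of attention heads, and all tensor shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the three distillation losses described above.
# All layer shapes, loss choices (L1), and hyperparameters are assumptions
# made for illustration; they are not taken from the itKD paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedAutoencoder(nn.Module):
    """Channel-wise compression/decompression shared by teacher and student."""
    def __init__(self, channels=256, compressed=64):
        super().__init__()
        self.encoder = nn.Conv2d(channels, compressed, kernel_size=1)
        self.decoder = nn.Conv2d(compressed, channels, kernel_size=1)

    def forward(self, x):
        code = self.encoder(x)              # channel-wise compressed code
        return code, self.decoder(code)     # (code, decompressed feature)

def interchange_transfer_losses(ae, f_teacher, f_student):
    """Compressed-representation loss + interchange reconstruction loss."""
    z_t, rec_t = ae(f_teacher)
    z_s, rec_s = ae(f_student)
    # Bind the compressed codes of both networks (acts as a regularizer).
    l_comp = F.l1_loss(z_s, z_t.detach())
    # Transfer the decompressed features in opposite directions: the
    # student's reconstruction chases the teacher feature and vice versa.
    l_inter = F.l1_loss(rec_s, f_teacher.detach()) + F.l1_loss(rec_t, f_student)
    return l_comp, l_inter

def attentive_head_loss(attn, h_teacher, h_student):
    """Match detection-head outputs after multi-head self-attention."""
    def tokens(h):                          # (B, C, H, W) -> (B, H*W, C)
        return h.flatten(2).transpose(1, 2)
    t_att, _ = attn(*(tokens(h_teacher),) * 3)   # self-attention: q = k = v
    s_att, _ = attn(*(tokens(h_student),) * 3)
    return F.l1_loss(s_att, t_att.detach())

# Usage with dummy map-view features and head outputs (shapes are made up).
ae = SharedAutoencoder(channels=256, compressed=64)
attn = nn.MultiheadAttention(embed_dim=64, num_heads=4, batch_first=True)
f_t, f_s = torch.randn(2, 256, 64, 64), torch.randn(2, 256, 64, 64)
h_t, h_s = torch.randn(2, 64, 32, 32), torch.randn(2, 64, 32, 32)
l_comp, l_inter = interchange_transfer_losses(ae, f_t, f_s)
l_head = attentive_head_loss(attn, h_t, h_s)
loss = l_comp + l_inter + l_head            # relative weights omitted
```

In training, the teacher would be frozen and the shared autoencoder optimized jointly with the student; the relative weighting of the three terms is a hyperparameter left unspecified in this sketch.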


Related research

05/23/2022 · PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
The remarkable breakthroughs in point cloud representation learning have...

06/25/2023 · Feature Adversarial Distillation for Point Cloud Classification
Due to the point cloud's irregular and unordered geometry structure, con...

01/31/2023 · AMD: Adaptive Masked Distillation for Object Detection
As a general model compression paradigm, feature-based knowledge distill...

07/12/2022 · HEAD: HEtero-Assists Distillation for Heterogeneous Object Detectors
Conventional knowledge distillation (KD) methods for object detection ma...

09/06/2023 · DMKD: Improving Feature-based Knowledge Distillation for Object Detection Via Dual Masking Augmentation
Recent mainstream masked distillation methods function by reconstructing...

06/20/2022 · Knowledge Distillation for Oriented Object Detection on Aerial Images
Deep convolutional neural network with increased number of parameters ha...

05/03/2021 · Initialization and Regularization of Factorized Neural Layers
Factorized layers – operations parameterized by products of two or more ma...
