Teacher-Student Knowledge Distillation for Radar Perception on Embedded Accelerators

03/14/2023
by Steven Shaw et al.

Many radar signal processing methodologies are being developed for critical road-safety perception tasks. Unfortunately, these signal processing algorithms are often poorly suited to the embedded hardware accelerators used in automobiles. Conversely, end-to-end machine learning (ML) approaches better exploit the performance gains that specialized accelerators provide. In this paper, we propose a teacher-student knowledge distillation approach for low-level radar perception tasks. We use a hybrid model for stationary object detection as a teacher to train an end-to-end ML student model, which can efficiently harness embedded compute for real-time deployment. We demonstrate that the proposed student model runs 100x faster than the teacher model.
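The abstract does not include code, but the teacher-student setup it describes follows the standard knowledge distillation recipe: a frozen teacher produces soft targets that supervise a lightweight student. Below is a minimal sketch of one Hinton-style distillation step, assuming PyTorch; the function name, model objects, and tensor shapes are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def distillation_step(student: nn.Module,
                      teacher: nn.Module,
                      radar_frames: torch.Tensor,
                      optimizer: torch.optim.Optimizer,
                      temperature: float = 2.0) -> float:
    """One knowledge distillation training step (illustrative only):
    the frozen teacher's soft outputs supervise the student."""
    teacher.eval()
    with torch.no_grad():
        teacher_logits = teacher(radar_frames)   # hybrid teacher (hypothetical)
    student_logits = student(radar_frames)       # end-to-end ML student

    # Soft-target loss: KL divergence between tempered softmaxes,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice this soft-target term is often combined with a task loss on ground-truth labels; the paper's specific loss for stationary object detection is not described in the abstract.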

Related research:

- Learning the Wrong Lessons: Inserting Trojans During Knowledge Distillation (03/09/2023). In recent years, knowledge distillation has become a cornerstone of effi...
- Rethinking the Knowledge Distillation From the Perspective of Model Calibration (10/31/2021). Recent years have witnessed dramatic improvements in the knowledge d...
- Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression (10/21/2021). Knowledge distillation (KD) is an effective model compression technique ...
- Hint-dynamic Knowledge Distillation (11/30/2022). Knowledge Distillation (KD) transfers the knowledge from a high-capacity...
- Knowledge Distillation for End-to-End Person Search (09/03/2019). We introduce knowledge distillation for end-to-end person search. End-to...
- DriveAdapter: Breaking the Coupling Barrier of Perception and Planning in End-to-End Autonomous Driving (08/01/2023). End-to-end autonomous driving aims to build a fully differentiable syste...
