SMURF: Spatial Multi-Representation Fusion for 3D Object Detection with 4D Imaging Radar

07/20/2023
by   Jianan Liu, et al.

4D millimeter wave (mmWave) radar is a promising technology for vehicle sensing due to its cost-effectiveness and operability in adverse weather conditions. However, the adoption of this technology has been hindered by sparsity and noise issues in radar point cloud data. This paper introduces spatial multi-representation fusion (SMURF), a novel approach to 3D object detection using a single 4D imaging radar. SMURF leverages multiple representations of radar detection points, including pillarization and density features of a multi-dimensional Gaussian mixture distribution obtained through kernel density estimation (KDE). KDE effectively mitigates measurement inaccuracy caused by the limited angular resolution and multi-path propagation of radar signals. Additionally, KDE helps alleviate point cloud sparsity by capturing density features. Experimental evaluations on the View-of-Delft (VoD) and TJ4DRadSet datasets demonstrate the effectiveness and generalization ability of SMURF, which outperforms recently proposed 4D imaging radar-based single-representation models. Moreover, while using 4D imaging radar only, SMURF still achieves performance comparable to the state-of-the-art 4D imaging radar and camera fusion-based method, with an increase of 1.22 in average precision on the bird's-eye view of the TJ4DRadSet dataset and 1.32 in mean average precision on the entire annotated area of the VoD dataset. The proposed method is also fast enough for real-time detection, with an inference time of no more than 0.05 seconds for most scans on both datasets. This research highlights the benefits of 4D mmWave radar and provides a strong benchmark for subsequent work on 3D object detection with 4D imaging radar.
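
The key ingredient named in the abstract, per-point density features obtained via kernel density estimation, can be sketched in a few lines. The snippet below is an illustrative assumption rather than the paper's implementation: it uses SciPy's gaussian_kde with an arbitrary bandwidth, a hypothetical point layout (x, y, z, RCS, Doppler), and simply appends the estimated density as an extra per-point feature before pillarization.

```python
# A minimal, hypothetical sketch of KDE-based density features for a radar
# point cloud; NOT the authors' SMURF implementation. The point layout
# (x, y, z, RCS, Doppler), bandwidth, and fusion step are assumptions.
import numpy as np
from scipy.stats import gaussian_kde


def kde_density_features(points_xyz: np.ndarray, bandwidth: float = 0.5) -> np.ndarray:
    """Return an (N, 1) array of estimated densities for N radar points.

    gaussian_kde places a Gaussian kernel on every input point (a Gaussian
    mixture); evaluating the mixture at the points themselves yields a
    per-point density that is high inside clusters and low for isolated,
    often noisy, returns.
    """
    kde = gaussian_kde(points_xyz.T, bw_method=bandwidth)  # expects (dims, N)
    density = kde(points_xyz.T)                            # shape (N,)
    return density[:, None]


# Usage: append the density as an extra per-point feature before pillarization.
points = np.random.rand(200, 5)                 # synthetic x, y, z, RCS, Doppler
density = kde_density_features(points[:, :3])
augmented = np.concatenate([points, density], axis=1)   # (200, 6)
```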

Related research

08/25/2022 · Bridging the View Disparity of Radar and Camera Features for Multi-modal Fusion 3D Object Detection
Environmental perception with multi-modal fusion of radar and camera is ...

09/07/2023 · ClusterFusion: Leveraging Radar Spatial Features for Radar-Camera 3D Object Detection in Autonomous Vehicles
Thanks to the complementary nature of millimeter wave radar and camera, ...

05/25/2023 · RC-BEVFusion: A Plug-In Module for Radar-Camera Bird's Eye View Feature Fusion
Radars and cameras belong to the most frequently used sensors for advanc...

07/21/2022 · R2P: A Deep Learning Model from mmWave Radar to Point Cloud
Recent research has shown the effectiveness of mmWave radar sensing for ...

09/18/2023 · Moving Object Detection and Tracking with 4D Radar Point Cloud
Mobile autonomy relies on the precise perception of dynamic environments...

03/11/2023 · Enhanced K-Radar: Optimal Density Reduction to Improve Detection Performance and Accessibility of 4D Radar Tensor-based Object Detection
Recent works have shown the superior robustness of four-dimensional (4D)...

03/05/2020 · mmFall: Fall Detection using 4D MmWave Radar and Variational Recurrent Autoencoder
Elderly fall prevention and detection is extremely crucial especially wi...
