RaLiBEV: Radar and LiDAR BEV Fusion Learning for Anchor Box Free Object Detection System

11/11/2022
by Yanlong Yang, et al.

Radar, the only sensor that can provide reliable perception in all weather conditions at an affordable cost, has been widely accepted as a key supplement to camera and LiDAR in modern advanced driver assistance systems (ADAS) and autonomous driving systems. Recent state-of-the-art works reveal that the fusion of radar and LiDAR can lead to robust detection in adverse weather such as fog. However, these methods still suffer from low accuracy in bounding box estimation. This paper proposes a bird's-eye view (BEV) fusion learning method for an anchor-box-free object detection system, which uses features derived from the radar range-azimuth heatmap and the LiDAR point cloud to estimate possible objects. Different label assignment strategies are designed to facilitate consistency between the classification of foreground or background anchor points and the corresponding bounding box regression. The performance of the proposed object detector is further enhanced by a novel interactive transformer module. We demonstrate the superior performance of the proposed method on the recently published Oxford Radar RobotCar (ORR) dataset, where our system outperforms other state-of-the-art methods in accuracy by a large margin.
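At a high level, the described pipeline has two BEV feature streams, one from the radar range-azimuth heatmap and one from the LiDAR point cloud, which are fused on a common BEV grid and decoded by an anchor-free head that classifies each BEV cell as foreground or background and regresses a box for it. The snippet below is a minimal PyTorch-style sketch of that general idea; the module names, channel sizes, and head outputs are illustrative assumptions and not the authors' implementation, which additionally includes the label assignment strategies and the interactive transformer module described in the abstract.

```python
# Minimal sketch of radar-LiDAR BEV feature fusion with an anchor-free
# detection head. All module names, channel sizes, and tensor shapes are
# illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn


class RadarLidarBEVFusion(nn.Module):
    def __init__(self, radar_ch=64, lidar_ch=64, fused_ch=128, num_classes=1):
        super().__init__()
        # Per-modality BEV encoders: small CNNs over the radar range-azimuth
        # heatmap (projected to BEV) and the voxelized LiDAR BEV grid.
        self.radar_encoder = nn.Sequential(
            nn.Conv2d(1, radar_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(radar_ch, radar_ch, 3, padding=1), nn.ReLU(),
        )
        self.lidar_encoder = nn.Sequential(
            nn.Conv2d(1, lidar_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(lidar_ch, lidar_ch, 3, padding=1), nn.ReLU(),
        )
        # Fuse the two streams by channel concatenation and a 1x1 convolution.
        self.fuse = nn.Conv2d(radar_ch + lidar_ch, fused_ch, 1)
        # Anchor-free heads: per-cell foreground/background classification
        # and box regression (center offsets, size, orientation).
        self.cls_head = nn.Conv2d(fused_ch, num_classes, 1)
        self.box_head = nn.Conv2d(fused_ch, 6, 1)  # dx, dy, w, l, sin, cos

    def forward(self, radar_bev, lidar_bev):
        # radar_bev, lidar_bev: (B, 1, H, W) grids on the same BEV raster.
        f = torch.cat([self.radar_encoder(radar_bev),
                       self.lidar_encoder(lidar_bev)], dim=1)
        f = self.fuse(f)
        return self.cls_head(f), self.box_head(f)


# Usage: fuse a 256x256 BEV grid from both sensors.
model = RadarLidarBEVFusion()
cls_map, box_map = model(torch.rand(2, 1, 256, 256),
                         torch.rand(2, 1, 256, 256))
print(cls_map.shape, box_map.shape)  # (2, 1, 256, 256), (2, 6, 256, 256)
```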


