Radar Voxel Fusion for 3D Object Detection

06/26/2021
by Felix Nobis, et al.

Automotive traffic scenes are complex due to the variety of possible scenarios, objects, and weather conditions that need to be handled. In contrast to more constrained environments, such as automated underground trains, automotive perception systems cannot be tailored to a narrow field of specific tasks but must handle an ever-changing environment with unforeseen events. As currently no single sensor is able to reliably perceive all relevant activity in the surroundings, sensor data fusion is applied to perceive as much information as possible. Fusing different sensors and sensor modalities at a low abstraction level makes it possible to compensate for sensor weaknesses and misdetections across sensors before the information-rich sensor data are compressed, and information thereby lost, in per-sensor object detection. This paper develops a low-level sensor fusion network for 3D object detection that fuses lidar, camera, and radar data. The fusion network is trained and evaluated on the nuScenes data set. On the test set, fusing radar data increases the resulting AP (Average Precision) detection score by about 5.1% and proves especially beneficial in inclement conditions such as rain and night scenes. Fusing additional camera data contributes positively only in conjunction with the radar fusion, which shows that interdependencies between the sensors are important for the detection result. Additionally, the paper proposes a novel loss to handle the discontinuity of a simple yaw representation for object detection. Our updated loss increases the detection and orientation estimation performance for all sensor input configurations. The code for this research has been made available on GitHub.
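The abstract does not give the paper's exact loss formulation, but the discontinuity it refers to is a standard pitfall: a naive regression loss on a raw yaw angle jumps at the ±π wrap-around, where two nearly identical orientations appear far apart numerically. A minimal sketch of one common discontinuity-free alternative (a sine-based periodic loss, assumed here for illustration and not necessarily the paper's choice):

```python
import numpy as np

def periodic_yaw_loss(pred_yaw, target_yaw):
    """Illustrative wrap-around-free yaw loss.

    A plain L1 loss |pred - target| on raw yaw angles is discontinuous
    at the +/- pi boundary: pred = 3.1 and target = -3.1 describe almost
    the same orientation, yet their raw difference is about 6.2.
    Taking the sine of half the angular difference is periodic in 2*pi,
    so it stays small for such near-identical orientations.
    """
    return np.abs(np.sin((pred_yaw - target_yaw) / 2.0))

# Two orientations on either side of the +/- pi boundary:
naive_l1 = abs(3.1 - (-3.1))                 # misleadingly large (~6.2)
periodic = periodic_yaw_loss(3.1, -3.1)      # small (~0.04)
```

This captures only the idea that a periodic formulation removes the discontinuity; the paper's actual loss may differ in form and weighting.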


Related research

05/15/2020  A Deep Learning-based Radar and Camera Sensor Fusion Architecture for Object Detection
Object detection in camera images using deep learning has been proven s...

10/07/2020  YOdar: Uncertainty-based Sensor Fusion for Vehicle Detection with Camera and Radar Sensors
In this work, we present an uncertainty-based method for sensor fusion w...

09/20/2023  STARNet: Sensor Trustworthiness and Anomaly Recognition via Approximated Likelihood Regret for Robust Edge Autonomy
Complex sensors such as LiDAR, RADAR, and event cameras have proliferate...

07/06/2018  Optimal Sensor Data Fusion Architecture for Object Detection in Adverse Weather Conditions
A good and robust sensor data fusion in diverse weather conditions is a ...

02/01/2022  Cyber-resilience for marine navigation by information fusion and change detection
Cyber-resilience is an increasing concern in developing autonomous navig...

04/15/2020  HODET: Hybrid Object DEtection and Tracking using mmWave Radar and Visual Sensors
Image sensors have been explored heavily in automotive applications for ...

08/27/2021  Fast Rule-Based Clutter Detection in Automotive Radar Data
Automotive radar sensors output a lot of unwanted clutter or ghost detec...
