Deep learning for radar data exploitation of autonomous vehicle

03/15/2022
by Arthur Ouaknine, et al.

Autonomous driving requires a detailed understanding of complex driving scenes. The redundancy and complementarity of the vehicle's sensors provide an accurate and robust comprehension of the environment, thereby increasing the level of performance and safety. This thesis focuses on the automotive RADAR, a low-cost active sensor that measures properties of surrounding objects, including their relative speed, and has the key advantage of not being impacted by adverse weather conditions. With the rapid progress of deep learning and the availability of public driving datasets, the perception ability of vision-based driving systems has considerably improved. The RADAR sensor is seldom used for scene understanding due to its poor angular resolution, the size, noise and complexity of raw RADAR data, and the lack of available datasets. This thesis proposes an extensive study of RADAR scene understanding, from the construction of an annotated dataset to the design of adapted deep learning architectures. First, it details approaches to tackle the current lack of data: a simple simulation as well as generative methods for creating annotated data are presented. It also describes the CARRADA dataset, composed of synchronised camera and RADAR data annotated with a semi-automatic method. The thesis then presents a set of deep learning architectures, with their associated loss functions, for RADAR semantic segmentation. It also introduces a method that opens up research into the fusion of LiDAR and RADAR sensors for scene understanding. Finally, this thesis presents a collaborative contribution, the RADIal dataset, with synchronised High-Definition (HD) RADAR, LiDAR and camera. A deep learning architecture is also proposed to estimate the RADAR signal processing pipeline while performing multitask learning for object detection and free driving space segmentation.
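For readers unfamiliar with the raw RADAR data the abstract refers to, the sketch below shows the conventional FMCW signal processing chain (range, Doppler and angle FFTs) that turns a raw ADC frame into the range-angle-Doppler representation such work builds on. It is a generic illustration under stated assumptions (array shapes, Hann windowing, uniform linear receive array), not the architecture or pipeline proposed in the thesis.

```python
# Minimal sketch of a conventional FMCW processing chain.
# Shapes, names and windowing choices are illustrative assumptions.
import numpy as np

def raw_adc_to_rad_cube(adc, n_range=None, n_doppler=None, n_angle=None):
    """Turn a raw ADC frame into a range-Doppler-angle power cube (in dB).

    adc: complex array of shape (samples_per_chirp, chirps_per_frame, rx_antennas).
    """
    n_range = n_range or adc.shape[0]
    n_doppler = n_doppler or adc.shape[1]
    n_angle = n_angle or adc.shape[2]

    # Range FFT over fast time (samples within a chirp), with a Hann window.
    rng = np.fft.fft(adc * np.hanning(adc.shape[0])[:, None, None], n=n_range, axis=0)
    # Doppler FFT over slow time (chirps), shifted so zero velocity sits in the middle.
    dop = np.fft.fftshift(np.fft.fft(rng, n=n_doppler, axis=1), axes=1)
    # Angle FFT across receive antennas (uniform linear array assumption).
    ang = np.fft.fftshift(np.fft.fft(dop, n=n_angle, axis=2), axes=2)

    # Power in dB; range-angle and range-Doppler views (as annotated in CARRADA)
    # are 2D slices or aggregations of this tensor.
    return 20.0 * np.log10(np.abs(ang) + 1e-12)

# Example on synthetic data: 256 samples/chirp, 64 chirps, 8 RX antennas.
frame = (np.random.randn(256, 64, 8) + 1j * np.random.randn(256, 64, 8)).astype(np.complex64)
rad = raw_adc_to_rad_cube(frame, n_angle=64)
print(rad.shape)  # (256, 64, 64): range x Doppler x angle
```

A learned pipeline such as the one described for RADIal would replace parts of this hand-crafted chain with trainable layers operating directly on the raw ADC data.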

Related research

03/30/2021 - Multi-View Radar Semantic Segmentation
Understanding the scene around the ego-vehicle is key to assisted and au...

03/04/2021 - PolarNet: Accelerated Deep Open Space Segmentation Using Automotive Radar in Polar Domain
Camera and Lidar processing have been revolutionized with the rapid deve...

12/20/2021 - Raw High-Definition Radar for Multi-Task Learning
With their robustness to adverse weather conditions and ability to measu...

05/04/2020 - CARRADA Dataset: Camera and Automotive Radar with Range-Angle-Doppler Annotations
High quality perception is essential for autonomous driving (AD) systems...

10/18/2018 - Probably Unknown: Deep Inverse Sensor Modelling In Radar
Radar presents a promising alternative to lidar and vision in autonomous...

08/06/2020 - A Sensitivity Analysis Approach for Evaluating a Radar Simulation for Virtual Testing of Autonomous Driving Functions
Simulation-based testing is a promising approach to significantly reduce...

06/01/2022 - Towards Deep Radar Perception for Autonomous Driving: Datasets, Methods, and Challenges
With recent developments, the performance of automotive radar has improv...
