RadarGNN: Transformation Invariant Graph Neural Network for Radar-based Perception

04/13/2023
by   Felix Fent, et al.

Reliable perception must be robust against challenging environmental conditions. Therefore, recent efforts have focused on using radar sensors in addition to camera and lidar sensors for perception applications. However, the sparsity of radar point clouds and limited data availability remain challenging for current perception methods. To address these challenges, a novel graph neural network is proposed that uses not just the information of the points themselves but also the relationships between the points. The model is designed to consider both point features and point-pair features, embedded in the edges of the graph. Furthermore, a general approach for achieving transformation invariance is proposed, which is robust against unseen scenarios and also counteracts the limited data availability. The transformation invariance is achieved by an invariant data representation rather than an invariant model architecture, making it applicable to other methods. The proposed RadarGNN model outperforms all previous methods on the RadarScenes dataset. In addition, the effects of different invariances on object detection and semantic segmentation quality are investigated. The code is made available as open-source software under https://github.com/TUMFTM/RadarGNN.
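The invariance-by-representation idea can be sketched in a few lines: instead of feeding absolute point coordinates to the network, the graph edges carry point-pair features (such as pairwise distances) that are unchanged by rotation and translation. The sketch below is illustrative only; the function name `invariant_edge_features` and its interface are assumptions, not taken from the RadarGNN codebase.

```python
import numpy as np

def invariant_edge_features(points, k=2):
    """Build k-nearest-neighbour edges whose features are translation-
    and rotation-invariant pairwise distances, rather than absolute
    coordinates. `points` is an (N, 2) array of radar detections.
    (Hypothetical helper for illustration, not the paper's API.)"""
    n = len(points)
    # Pairwise Euclidean distances: invariant under rigid transformations.
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    edges, feats = [], []
    for i in range(n):
        # k nearest neighbours of point i (index 0 is the point itself).
        nbrs = np.argsort(dist[i])[1:k + 1]
        for j in nbrs:
            edges.append((i, j))
            feats.append(dist[i, j])  # invariant edge feature
    return np.array(edges), np.array(feats)

pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
e, f = invariant_edge_features(pts)

# Rotating the whole point cloud leaves the edge features unchanged,
# so a model consuming them is transformation invariant by construction.
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
e2, f2 = invariant_edge_features(pts @ R.T)
assert np.allclose(f, f2)
```

Because the invariance lives in the data representation, the same preprocessing could in principle be placed in front of any point-cloud network, which is the portability the abstract claims.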

