A LiDAR-based real-time capable 3D Perception System for Automated Driving in Urban Domains

05/07/2020
by Jens Rieken, et al.

We present a LiDAR-based, real-time-capable 3D perception system for automated driving in urban domains. The hierarchical system design models stationary and movable parts of the environment simultaneously under real-time conditions. Our approach extends the state of the art with detailed enhancements for perceiving road users and drivable corridors, even in the case of non-flat ground surfaces and overhanging or protruding elements. We describe a runtime-efficient point cloud processing pipeline consisting of adaptive ground surface estimation, 3D clustering and motion classification stages. Based on the pipeline's output, the stationary environment is represented by a multi-feature mapping and fusion approach, while movable elements are handled by an object tracking system that can use multiple reference points to account for viewpoint changes. We further enhance the tracking system by explicitly considering occlusion and ambiguity cases. The system is evaluated on a subset of the TUBS Road User Dataset, using common performance metrics extended with application-driven aspects of real-world traffic scenarios. The perception system handles the addressed scenarios while preserving real-time capability.

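No code accompanies this abstract, but the first two pipeline stages can be illustrated with a minimal sketch: a per-cell local ground-height estimate (a strong simplification of the adaptive ground surface estimation, chosen to tolerate non-flat ground) followed by grid-based connected-component clustering of the remaining points. All function names, grid sizes and thresholds below are illustrative assumptions rather than the authors' implementation; the motion classification, mapping and tracking stages are omitted.

```python
# Illustrative sketch of two stages named in the abstract: adaptive-style ground
# estimation and 3D clustering of non-ground points. Parameters and names are
# assumptions for demonstration only.
import numpy as np
from collections import deque

def split_ground(points, cell_size=1.0, height_thresh=0.25):
    """Label points as ground using a per-cell local ground height.

    points: (N, 3) array of x, y, z coordinates in the vehicle frame.
    Returns a boolean mask that is True for ground points.
    """
    ij = np.floor(points[:, :2] / cell_size).astype(np.int64)  # 2D grid index per point
    cells = {}
    for idx, key in enumerate(map(tuple, ij)):
        cells.setdefault(key, []).append(idx)
    ground_z = np.empty(len(points))
    for key, idxs in cells.items():
        ground_z[idxs] = points[idxs, 2].min()  # lowest point approximates local ground
    return (points[:, 2] - ground_z) < height_thresh

def cluster_obstacles(points, ground_mask, cell_size=0.5):
    """Group non-ground points into clusters via 4-connected grid components."""
    cells = {}
    for idx in np.flatnonzero(~ground_mask):
        key = tuple(np.floor(points[idx, :2] / cell_size).astype(np.int64))
        cells.setdefault(key, []).append(idx)
    labels = np.full(len(points), -1, dtype=np.int64)  # -1 = ground / unassigned
    next_label = 0
    for start in list(cells):
        if labels[cells[start][0]] != -1:
            continue  # cell already absorbed by an earlier component
        queue = deque([start])
        while queue:  # flood fill over occupied neighbouring cells
            cx, cy = queue.popleft()
            if (cx, cy) not in cells or labels[cells[(cx, cy)][0]] != -1:
                continue
            labels[cells[(cx, cy)]] = next_label
            queue.extend([(cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)])
        next_label += 1
    return labels

# Example usage on random points standing in for a LiDAR scan.
pts = np.random.uniform([-20, -20, -0.2], [20, 20, 2.0], size=(10000, 3))
ground = split_ground(pts)
cluster_ids = cluster_obstacles(pts, ground)
```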

Related research

01/08/2018 · Towards Multi-Object Detection and Tracking in Urban Scenario under Uncertainties
Urban-oriented autonomous vehicles require a reliable perception technol...

05/15/2023 · aUToLights: A Robust Multi-Camera Traffic Light Detection and Tracking System
Following four successful years in the SAE AutoDrive Challenge Series I,...

07/11/2019 · Online Inference and Detection of Curbs in Partially Occluded Scenes with Sparse LIDAR
Road boundaries, or curbs, provide autonomous vehicles with essential in...

12/20/2018 · Environment Perception Framework Fusing Multi-Object Tracking, Dynamic Occupancy Grid Maps and Digital Maps
Autonomously driving vehicles require a complete and robust perception o...

05/21/2020 · RV-FuseNet: Range View based Fusion of Time-Series LiDAR Data for Joint 3D Object Detection and Motion Forecasting
Autonomous vehicles rely on robust real-time detection and future motion...

07/09/2020 · Camera-Lidar Integration: Probabilistic sensor fusion for semantic mapping
An automated vehicle operating in an urban environment must be able to p...

09/19/2019 · How to Evaluate Proving Grounds for Self-Driving? A Quantitative Approach
Proving ground has been a critical component in testing and validation f...
