Physics and semantic informed multi-sensor calibration via optimization theory and self-supervised learning

06/06/2022
by Shmuel Y. Hayoun, et al.

Safe and reliable autonomous driving depends greatly on an accurate and robust perception system, which in turn cannot be fully realized without precisely calibrated sensors. Environmental and operational conditions, as well as improper maintenance, can introduce calibration errors that inhibit sensor fusion and consequently degrade perception performance. Traditionally, sensor calibration is performed in a controlled environment with one or more known targets. Such a procedure can only be carried out between drives and requires manual operation: a tedious task if it must be repeated regularly. This has sparked recent interest in online targetless methods, which derive a set of geometric transformations from perceived environmental features. However, the required redundancy in sensing modalities makes the task even more challenging, as the features captured by each modality, and their distinctiveness, may vary. We present a holistic approach to the joint calibration of a camera-lidar-radar trio. Leveraging prior knowledge and physical properties of these sensing modalities together with semantic information, we propose two targetless calibration methods within a common cost-minimization framework: the first via direct online optimization, and the second via self-supervised learning (SSL).
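To make the cost-minimization framing concrete, the sketch below is a minimal, illustrative example (not the authors' method): it recovers an unknown camera-lidar translation offset by minimizing the reprojection error of 3D points against their observed image locations. The intrinsic matrix, the synthetic point cloud, and the translation-only parameterization are all simplifying assumptions; a real calibration pipeline would optimize a full 6-DoF extrinsic and would build its cost from perceived environmental features rather than known correspondences.

```python
import numpy as np
from scipy.optimize import minimize

# Assumed pinhole intrinsics (focal length 500 px, principal point at image center).
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])

# Synthetic lidar points in front of the camera (stand-in for real sensor data).
rng = np.random.default_rng(0)
pts = rng.uniform(low=[-2.0, -2.0, 4.0], high=[2.0, 2.0, 10.0], size=(50, 3))

# Ground-truth lidar-to-camera translation we will try to recover.
t_true = np.array([0.2, -0.1, 0.3])

def project(points, t):
    """Apply translation t, then project with the pinhole model."""
    p = (points + t) @ K.T
    return p[:, :2] / p[:, 2:3]  # perspective divide -> pixel coordinates

# "Observed" pixel locations, generated with the true extrinsic.
obs = project(pts, t_true)

def cost(t):
    """Sum of squared reprojection errors: the quantity being minimized."""
    return np.sum((project(pts, t) - obs) ** 2)

# Direct optimization of the calibration parameters from a zero initial guess.
res = minimize(cost, x0=np.zeros(3), method="Nelder-Mead",
               options={"xatol": 1e-8, "fatol": 1e-8})
print(res.x)  # should be close to t_true
```

The same structure carries over to the learned variant: an SSL model replaces the explicit optimizer, but the supervisory signal is still a differentiable cost of this kind evaluated on cross-modal consistency.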


