Automatic Surround Camera Calibration Method in Road Scene for Self-driving Car

05/26/2023
by Jixiang Li, et al.

With the development of autonomous driving technology, sensor calibration has become a key prerequisite for accurate perception fusion and localization. Accurate calibration ensures that each sensor functions properly and that information from multiple sensors can be aggregated reliably. Among calibration problems, surround-view camera calibration has received extensive attention, since calibration accuracy directly affects perception and depth estimation in autonomous driving applications. For online calibration of surround-view cameras, traditional feature-extraction-based methods suffer under strong image distortion when the initial extrinsic error is large, making them neither robust nor accurate. Most existing methods instead use the sparse direct method to calibrate multiple cameras, which can in principle achieve both accuracy and real-time performance. However, the sparse direct method requires a good initial value; with a large initial error, the optimization often gets stuck in a local optimum. To this end, we introduce a robust automatic calibration and refinement method for multiple cameras (pinhole or fisheye) in road scenes. We adopt a coarse-to-fine random-search strategy that tolerates large disturbances of the initial extrinsic parameters, compensating for the tendency of nonlinear optimization methods to fall into local optima. Finally, quantitative and qualitative experiments in real and simulated environments show that the proposed method is both accurate and robust. The open-source code is available at https://github.com/OpenCalib/SurroundCameraCalib.
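The coarse-to-fine random search mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: `cost_fn` stands in for the method's actual alignment objective (a photometric error over the surround-view overlap regions), and the parameter vector, stage counts, and shrink factor are assumed values chosen for demonstration.

```python
import random

def coarse_to_fine_random_search(cost_fn, init_params, init_radius=0.5,
                                 shrink=0.5, stages=4,
                                 samples_per_stage=200, seed=0):
    """Coarse-to-fine random search over extrinsic parameters.

    cost_fn     -- hypothetical callable mapping a parameter tuple
                   (e.g. yaw, pitch, roll, tx, ty, tz) to a scalar cost
    init_params -- initial extrinsic estimate, possibly far from optimal
    """
    rng = random.Random(seed)
    best = tuple(init_params)
    best_cost = cost_fn(best)
    radius = init_radius
    for _ in range(stages):
        # Sample candidates uniformly in a box around the current best.
        for _ in range(samples_per_stage):
            cand = tuple(p + rng.uniform(-radius, radius) for p in best)
            c = cost_fn(cand)
            if c < best_cost:
                best, best_cost = cand, c
        # Shrink the search radius to refine around the current best.
        radius *= shrink
    return best, best_cost
```

Because the coarse stage samples over a wide box, the search can escape basins that would trap a gradient-based refiner started from a bad initial extrinsic; the later, narrower stages then recover precision near the best candidate.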


