Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups

01/12/2021 ∙ by Jorge Beltrán, et al.

Most sensor setups for onboard autonomous perception are composed of LiDARs and vision systems, as they provide complementary information that improves the reliability of the different algorithms necessary to obtain a robust scene understanding. However, the effective use of information from different sources requires an accurate calibration between the sensors involved, which usually implies a tedious and burdensome process. We present a method to calibrate the extrinsic parameters of any pair of sensors involving LiDARs, monocular or stereo cameras, of the same or different modalities. The procedure is composed of two stages: first, reference points belonging to a custom calibration target are extracted from the data provided by the sensors to be calibrated, and second, the optimal rigid transformation is found through the registration of both point sets. The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups. In order to assess the performance of the proposed method, a novel evaluation suite built on top of a popular simulation framework is introduced. Experiments on the synthetic environment show that our calibration algorithm significantly outperforms existing methods, whereas real data tests corroborate the results obtained in the evaluation suite. Open-source code is available at
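The paper's second stage, finding the optimal rigid transformation between the two extracted reference-point sets, is not detailed in the abstract. As a minimal sketch, assuming point correspondences are already established, the closed-form Kabsch/SVD solution computes the rotation and translation aligning one set onto the other (the function name and structure here are illustrative, not the authors' implementation):

```python
import numpy as np

def rigid_transform(src, dst):
    """Estimate R, t minimizing sum ||R @ src_i + t - dst_i||^2
    between corresponding 3D point sets (Kabsch/SVD method)."""
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so that det(R) = +1
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

Applied to reference points extracted from the calibration target in the LiDAR frame (`src`) and the camera frame (`dst`), this yields the extrinsic rotation and translation between the two sensors.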


Code Repositories

- Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups. ROS Package.
- Repository including Gazebo models, plugins, and worlds to test calibration algorithms for LiDAR-camera setups.