R^3LIVE++: A Robust, Real-time, Radiance reconstruction package with a tightly-coupled LiDAR-Inertial-Visual state Estimator

09/08/2022
by   Jiarong Lin, et al.

Simultaneous localization and mapping (SLAM) is crucial for autonomous robots (e.g., self-driving cars, autonomous drones), 3D mapping systems, and AR/VR applications. This work proposes a novel LiDAR-inertial-visual fusion framework, termed R^3LIVE++, that achieves robust and accurate state estimation while simultaneously reconstructing the radiance map on the fly. R^3LIVE++ consists of a LiDAR-inertial odometry (LIO) subsystem and a visual-inertial odometry (VIO) subsystem, both running in real time. The LIO subsystem utilizes LiDAR measurements to reconstruct the geometric structure (i.e., the positions of 3D points), while the VIO subsystem simultaneously recovers the radiance information of that structure from the input images. R^3LIVE++ builds on R^3LIVE and further improves localization and mapping accuracy by accounting for the camera's photometric calibration (e.g., the non-linear response function and lens vignetting) and by estimating the camera exposure time online. We conduct extensive experiments on both public and private datasets to compare the proposed system against other state-of-the-art SLAM systems. Quantitative and qualitative results show that our system significantly outperforms the others in both accuracy and robustness. In addition, to demonstrate the extensibility of our work, we develop several applications based on the reconstructed radiance maps, such as high dynamic range (HDR) imaging, virtual environment exploration, and 3D video gaming. Lastly, to share our findings and contribute to the community, we make our code, hardware design, and dataset publicly available on GitHub: github.com/hku-mars/r3live
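The photometric modeling the abstract refers to is commonly written as I(u) = f(t · V(u) · L), where f is the camera's non-linear response function, t the exposure time, V(u) the vignetting attenuation at pixel u, and L the scene radiance; inverting this model makes the recovered radiance invariant to exposure changes. The sketch below is a minimal illustration of that inversion, not the calibration actually used by R^3LIVE++: the gamma response, cos^4 vignetting model, and all function names here are assumptions for illustration only.

```python
import numpy as np

# Minimal sketch of a photometric camera model I = f(t * V(u) * L).
# The gamma response and cos^4 vignetting are assumed models chosen
# for illustration; they are NOT the calibration used by R^3LIVE++.

def response(x, gamma=2.2):
    """Non-linear camera response f: sensor irradiance -> pixel intensity."""
    return np.clip(x, 0.0, 1.0) ** (1.0 / gamma)

def inverse_response(intensity, gamma=2.2):
    """f^{-1}: map an observed intensity back to sensor irradiance."""
    return np.clip(intensity, 0.0, 1.0) ** gamma

def vignetting(u, v, cx, cy, focal):
    """Radially symmetric cos^4 vignetting factor V(u) for a pinhole camera."""
    r2 = ((u - cx) ** 2 + (v - cy) ** 2) / focal ** 2
    return 1.0 / (1.0 + r2) ** 2  # cos^4(theta), theta = angle off the axis

def recover_radiance(intensity, u, v, exposure, cx, cy, focal):
    """Invert I = f(t * V(u) * L) to estimate the scene radiance L."""
    irradiance = inverse_response(intensity)
    return irradiance / (exposure * vignetting(u, v, cx, cy, focal))

# Example: the same surface point observed in two frames with different
# exposure times and pixel locations should map to the same radiance.
L_true = 40.0
cx, cy, focal = 320.0, 240.0, 400.0
for t, (u, v) in [(0.01, (100.0, 80.0)), (0.02, (500.0, 400.0))]:
    observed = response(t * vignetting(u, v, cx, cy, focal) * L_true)
    print(recover_radiance(observed, u, v, t, cx, cy, focal))  # ~40.0 twice
```

Because the recovered radiance no longer depends on the exposure time or the pixel's position in the image, observations of the same map point from different frames become directly comparable, which is what allows a consistent radiance map to be fused across frames.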


research
09/10/2021

R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

In this letter, we propose a novel LiDAR-Inertial-Visual sensor fusion f...
research
01/12/2023

ImMesh: An Immediate LiDAR Localization and Meshing Framework

In this paper, we propose a novel LiDAR(-inertial) odometry and mapping ...
research
07/03/2020

A decentralized framework for simultaneous calibration, localization and mapping with multiple LiDARs

LiDAR is playing a more and more essential role in autonomous driving ve...
research
03/24/2018

3D Reconstruction & Assessment Framework based on affordable 2D Lidar

Lidar is extensively used in the industry and mass market, due to its me...
research
08/15/2023

Extended Preintegration for Relative State Estimation of Leader-Follower Platform

Relative state estimation using exteroceptive sensors suffers from limit...
research
01/14/2022

SRVIO: Super Robust Visual Inertial Odometry for dynamic environments and challenging Loop-closure conditions

The visual localization or odometry problem is a well-known challenge in...
research
04/25/2023

Multi-Camera Visual-Inertial Simultaneous Localization and Mapping for Autonomous Valet Parking

Localization and mapping are key capabilities for self-driving vehicles....
