R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

09/10/2021
by Jiarong Lin, et al.

In this letter, we propose a novel LiDAR-Inertial-Visual sensor fusion framework termed R3LIVE, which takes advantage of the measurements of LiDAR, inertial, and visual sensors to achieve robust and accurate state estimation. R3LIVE consists of two subsystems: a LiDAR-inertial odometry (LIO) and a visual-inertial odometry (VIO). The LIO subsystem (FAST-LIO) leverages the measurements from the LiDAR and inertial sensors and builds the geometric structure of the global map (i.e., the positions of 3D points). The VIO subsystem utilizes the data of the visual-inertial sensors and renders the map's texture (i.e., the color of 3D points). More specifically, the VIO subsystem fuses the visual data directly and effectively by minimizing the frame-to-map photometric error. R3LIVE is developed based on our previous work R2LIVE, with careful architecture design and implementation. Experiment results show that the resultant system is more robust and achieves higher accuracy in state estimation than current counterparts (see our attached video). R3LIVE is a versatile and well-engineered system toward various possible applications: it can not only serve as a SLAM system for real-time robotic applications, but can also reconstruct dense, precise, RGB-colored 3D maps for applications such as surveying and mapping. Moreover, to make R3LIVE more extensible, we develop a series of offline utilities for reconstructing and texturing meshes, which further narrows the gap between R3LIVE and various 3D applications such as simulators and video games (see our demo video). To share our findings and make contributions to the community, we open source R3LIVE on our GitHub, including all of our code, software utilities, and the mechanical design of our device.
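To make the frame-to-map photometric error concrete: each point in the global map carries an estimated RGB color, and for every new image the VIO projects the visible map points into the frame and compares the observed pixel color against the point's stored color. The sketch below (Python/NumPy, purely illustrative; the function names and the nearest-pixel lookup are our own simplifications, not part of the R3LIVE codebase, which is written in C++) shows one way such residuals can be formed; in the paper they are minimized with respect to the camera state inside an iterated error-state Kalman filter.

```python
import numpy as np

def project(K, T_cw, p_w):
    """Project a world point into the image; returns (pixel, depth)."""
    p_c = T_cw[:3, :3] @ p_w + T_cw[:3, 3]    # world -> camera frame
    if p_c[2] <= 1e-6:                        # behind or on the camera plane
        return None, p_c[2]
    uv = K @ (p_c / p_c[2])                   # pinhole projection
    return uv[:2], p_c[2]

def photometric_residuals(K, T_cw, map_points, map_colors, image):
    """Stack per-point residuals: observed pixel color minus stored map color.

    map_points: (N, 3) world coordinates; map_colors: (N, 3) RGB in [0, 1];
    image: (H, W, 3) float RGB; T_cw: 4x4 world-to-camera transform.
    """
    h, w = image.shape[:2]
    residuals = []
    for p_w, c_map in zip(map_points, map_colors):
        uv, depth = project(K, T_cw, p_w)
        if uv is None or not (0 <= uv[0] < w and 0 <= uv[1] < h):
            continue                           # point not visible in this frame
        c_obs = image[int(uv[1]), int(uv[0])]  # nearest-pixel color lookup
        residuals.append(c_obs - c_map)        # the photometric error term
    return np.asarray(residuals)

# Toy usage: identity pose, one map point one meter in front of the camera.
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
T_cw = np.eye(4)
image = np.full((480, 640, 3), 0.5)
r = photometric_residuals(K, T_cw, np.array([[0.0, 0.0, 1.0]]),
                          np.array([[0.4, 0.5, 0.6]]), image)
```

In practice one would use sub-pixel (bilinear) interpolation of the image and analytic Jacobians of the residual with respect to the pose; the nearest-pixel lookup here is only for brevity.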


Related research

02/24/2021 · R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and mapping
In this letter, we propose a robust, real-time tightly-coupled multi-sen...

09/08/2022 · R^3LIVE++: A Robust, Real-time, Radiance reconstruction package with a tightly-coupled LiDAR-Inertial-Visual state Estimator
Simultaneous localization and mapping (SLAM) are crucial for autonomous ...

03/02/2022 · FAST-LIVO: Fast and Tightly-coupled Sparse-Direct LiDAR-Inertial-Visual Odometry
To achieve accurate and robust pose estimation in Simultaneous Localizat...

08/22/2023 · Four years of multi-modal odometry and mapping on the rail vehicles
Precise, seamless, and efficient train localization as well as long-term...

01/12/2023 · ImMesh: An Immediate LiDAR Localization and Meshing Framework
In this paper, we propose a novel LiDAR(-inertial) odometry and mapping ...

04/13/2019 · Visual-Inertial Mapping with Non-Linear Factor Recovery
Cameras and inertial measurement units are complementary sensors for ego...

07/06/2021 · Best Axes Composition: Multiple Gyroscopes IMU Sensor Fusion to Reduce Systematic Error
In this paper, we propose an algorithm to combine multiple cheap Inertia...
