Robust Multi-Modal Multi-LiDAR-Inertial Odometry and Mapping for Indoor Environments

03/05/2023
by   Li Qingqing, et al.

Integrating multiple LiDAR sensors can significantly enhance a robot's perception of the environment, enabling it to capture adequate measurements for simultaneous localization and mapping (SLAM). Indeed, solid-state LiDARs can bring high resolution at a lower cost than traditional spinning LiDARs in robotic applications. However, their reduced field of view (FoV) limits performance, particularly indoors. In this paper, we propose a tightly-coupled multi-modal multi-LiDAR-inertial SLAM system for surveying and mapping tasks. By taking advantage of both solid-state and spinning LiDARs, together with their built-in inertial measurement units (IMUs), we achieve robust, low-drift ego-estimation as well as high-resolution maps in diverse challenging indoor environments (e.g., small, featureless rooms). First, a spatial-temporal calibration module aligns timestamps and calibrates the extrinsic parameters between the sensors. Then, we extract two groups of feature points, edge points and plane points, from the LiDAR data. Next, using pre-integrated IMU data, an undistortion module corrects motion distortion in the LiDAR point clouds. Finally, the undistorted point clouds are merged into a single cloud and processed by a sliding-window-based optimization module. Extensive experimental results show that our method is competitive with state-of-the-art spinning-LiDAR-only and solid-state-LiDAR-only SLAM systems in diverse environments. More results, code, and datasets can be found at \href{https://github.com/TIERS/multi-modal-loam}{https://github.com/TIERS/multi-modal-loam}.
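The edge/plane feature extraction mentioned in the abstract is typically done, in the LOAM family of methods, by scoring each point's local smoothness (curvature) along the scan line: high-curvature points become edge candidates, low-curvature points become plane candidates. The following is a minimal sketch of that idea; the function name, thresholds, and neighborhood size are illustrative choices, not the paper's actual implementation.

```python
import numpy as np

def classify_features(scan_points, edge_thresh=0.15, plane_thresh=0.02, k=5):
    """Classify points of one ordered LiDAR scan line into edge and plane
    candidates using a LOAM-style local-smoothness (curvature) score.

    scan_points: (N, 3) array of points, ordered along the scan line.
    Returns (edge_idx, plane_idx) index arrays into scan_points.
    """
    n = len(scan_points)
    curvature = np.full(n, np.nan)  # NaN marks points without full neighborhoods
    for i in range(k, n - k):
        # Sum of difference vectors between point i and its 2k neighbors:
        # large when the local surface bends (corner), near zero on a plane.
        diff = (2 * k) * scan_points[i] \
            - scan_points[i - k:i].sum(axis=0) \
            - scan_points[i + 1:i + k + 1].sum(axis=0)
        # Normalize by range so the score is roughly distance-independent.
        curvature[i] = np.linalg.norm(diff) / (2 * k * np.linalg.norm(scan_points[i]))
    valid = ~np.isnan(curvature)
    edge_idx = np.where(valid & (curvature > edge_thresh))[0]
    plane_idx = np.where(valid & (curvature < plane_thresh))[0]
    return edge_idx, plane_idx
```

For example, on a synthetic L-shaped scan line (two perpendicular walls), the corner point receives a high curvature score and is classified as an edge, while interior wall points score near zero and become plane candidates. In the full system each feature group would then feed separate point-to-line and point-to-plane residuals in the optimization.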


Related research:

- 03/07/2022 · Multi-Modal Lidar Dataset for Benchmarking General-Purpose Localization and Mapping Algorithms. "Lidar technology has evolved significantly over the last decade, with hi..."
- 07/13/2023 · FF-LINS: A Consistent Frame-to-Frame Solid-State-LiDAR-Inertial State Estimator. "Most of the existing LiDAR-inertial navigation systems are based on fram..."
- 06/14/2023 · Challenges of Indoor SLAM: A multi-modal multi-floor dataset for SLAM evaluation. "Robustness in Simultaneous Localization and Mapping (SLAM) remains one o..."
- 06/03/2022 · OdomBeyondVision: An Indoor Multi-modal Multi-platform Odometry Dataset Beyond the Visible Spectrum. "This paper presents a multimodal indoor odometry dataset, OdomBeyondVisi..."
- 10/27/2020 · Robust Odometry and Mapping for Multi-LiDAR Systems with Online Extrinsic Calibration. "Combining multiple LiDARs enables a robot to maximize its perceptual awa..."
- 10/03/2022 · A Benchmark for Multi-Modal Lidar SLAM with Ground Truth in GNSS-Denied Environments. "Lidar-based simultaneous localization and mapping (SLAM) approaches have..."
- 07/14/2022 · Challenges of SLAM in extremely unstructured environments: the DLR Planetary Stereo, Solid-State LiDAR, Inertial Dataset. "We present the DLR Planetary Stereo, Solid-State LiDAR, Inertial (S3LI) ..."
