ImMesh: An Immediate LiDAR Localization and Meshing Framework

01/12/2023
by Jiarong Lin, et al.

In this paper, we propose a novel LiDAR(-inertial) odometry and mapping framework that achieves simultaneous localization and meshing in real time. The proposed framework, termed ImMesh, comprises four tightly-coupled modules: receiver, localization, meshing, and broadcaster. The localization module utilizes the preprocessed sensor data from the receiver, estimates the sensor pose online by registering LiDAR scans to maps, and dynamically grows the map. The meshing module then takes each registered LiDAR scan and incrementally reconstructs the triangle mesh on the fly. Finally, the real-time odometry, map, and mesh are published via our broadcaster. The key contribution of this work is the meshing module, which represents the scene with an efficient hierarchical voxel structure, quickly finds the voxels observed by a new scan, and reconstructs the triangle facets in each voxel incrementally. This voxel-wise meshing operation is carefully designed for efficiency: it first performs a dimension reduction by projecting 3D points onto a 2D local plane contained in the voxel, and then carries out the meshing with pull, commit, and push steps that incrementally reconstruct the triangle facets. To the best of our knowledge, this is the first work in the literature that can reconstruct the triangle mesh of large-scale scenes online, relying only on a standard CPU without GPU acceleration. To share our findings and contribute to the community, we make our code publicly available on GitHub: https://github.com/hku-mars/ImMesh.
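To make the voxel-wise meshing idea concrete, the following C++ snippet is a minimal sketch, not the authors' implementation: it hashes registered scan points into a voxel map and, for each voxel, projects the 3D points onto the voxel's local plane to obtain the 2D coordinates that an incremental 2D triangulation (the pull/commit/push update) would operate on. All names here (VoxelMeshMap, insertScan, projectToPlane) and the fixed plane normal are illustrative assumptions; the actual plane fitting and facet reconstruction are described in the paper and released code.

```cpp
// Minimal sketch of the voxel-hashed map with per-voxel 3D -> 2D projection.
// Assumed/illustrative names; not the ImMesh implementation itself.
#include <cmath>
#include <cstdint>
#include <unordered_map>
#include <unordered_set>
#include <vector>

struct Vec3 { double x = 0, y = 0, z = 0; };
struct Vec2 { double u = 0, v = 0; };

static Vec3 cross(const Vec3& a, const Vec3& b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static double dot(const Vec3& a, const Vec3& b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
static Vec3 normalized(const Vec3& a) {
    const double n = std::sqrt(dot(a, a));
    return {a.x / n, a.y / n, a.z / n};
}

// Integer voxel index used as the key of the global voxel hash map.
struct VoxelIdx {
    int64_t x, y, z;
    bool operator==(const VoxelIdx& o) const { return x == o.x && y == o.y && z == o.z; }
};
struct VoxelIdxHash {
    size_t operator()(const VoxelIdx& k) const {
        return static_cast<size_t>((k.x * 73856093LL) ^ (k.y * 19349669LL) ^ (k.z * 83492791LL));
    }
};

// Per-voxel state: contained points, a local plane, and the dimension-reduced
// 2D points that the incremental triangulation (pull / commit / push) works on.
struct Voxel {
    std::vector<Vec3> points;
    Vec3 plane_normal{0, 0, 1};   // placeholder; a real system fits this from the points
    Vec3 plane_point{0, 0, 0};
    std::vector<Vec2> projected;
};

class VoxelMeshMap {
public:
    explicit VoxelMeshMap(double voxel_size) : voxel_size_(voxel_size) {}

    // Insert one registered scan and return the voxels it touched, i.e. the
    // voxels whose triangle facets would need an incremental update.
    std::vector<VoxelIdx> insertScan(const std::vector<Vec3>& scan) {
        std::unordered_set<VoxelIdx, VoxelIdxHash> touched;
        for (const Vec3& p : scan) {
            const VoxelIdx idx{cell(p.x), cell(p.y), cell(p.z)};
            Voxel& v = map_[idx];
            v.points.push_back(p);
            v.projected.push_back(projectToPlane(v, p));  // 3D -> 2D dimension reduction
            touched.insert(idx);
        }
        return {touched.begin(), touched.end()};
    }

private:
    int64_t cell(double c) const { return static_cast<int64_t>(std::floor(c / voxel_size_)); }

    // Express a 3D point in a 2D frame spanned by two axes orthogonal to the
    // voxel's plane normal.
    Vec2 projectToPlane(const Voxel& v, const Vec3& p) const {
        const Vec3 n = normalized(v.plane_normal);
        const Vec3 ref = std::fabs(n.z) < 0.9 ? Vec3{0, 0, 1} : Vec3{1, 0, 0};
        const Vec3 u_axis = normalized(cross(n, ref));
        const Vec3 v_axis = cross(n, u_axis);
        const Vec3 d{p.x - v.plane_point.x, p.y - v.plane_point.y, p.z - v.plane_point.z};
        return {dot(d, u_axis), dot(d, v_axis)};
    }

    double voxel_size_;
    std::unordered_map<VoxelIdx, Voxel, VoxelIdxHash> map_;
};
```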


