Four years of multi-modal odometry and mapping on rail vehicles

08/22/2023
by Yusheng Wang et al.

Precise, seamless, and efficient train localization, together with long-term railway environment monitoring, is essential to reliability, availability, maintainability, and safety (RAMS) engineering for railroad systems. Simultaneous localization and mapping (SLAM) lies at the core of solving both problems concurrently. To this end, we propose in this paper a high-performance and versatile multi-modal framework targeted at the odometry and mapping task for various rail vehicles. Our system is built atop an inertial-centric state estimator that tightly couples light detection and ranging (LiDAR), visual, and, optionally, satellite navigation and map-based localization information, while retaining the convenience and extendibility of loosely coupled methods. The inertial sensors (IMU and wheel encoder) are treated as the primary sensing modality, and observations from the subsystems constrain the accelerometer and gyroscope biases. Compared to point-only LiDAR-inertial methods, our approach leverages more geometric information by introducing both the track plane and electric power pillars into state estimation. The visual-inertial subsystem likewise exploits environmental structure by employing both lines and points. In addition, the method handles sensor failures through automatic reconfiguration that bypasses the failed modules. Our proposed method has been extensively tested in railway environments over four years, covering general-speed, high-speed, and metro lines, with both passenger and freight traffic investigated. Furthermore, we aim to share, in an open way, the experience, problems, and successes of our group with the robotics community, so that others working in such environments can avoid similar errors. To this end, we open source some of the datasets to benefit the research community.
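The automatic-reconfiguration idea described above can be sketched as follows: an inertial-centric estimator collects corrections from its subsystems (LiDAR, visual, GNSS) and simply bypasses any subsystem that reports failure. This is a minimal illustrative sketch; all class and function names are hypothetical and do not come from the paper's implementation, and the scalar "bias correction" stands in for a full state-error update.

```python
# Hypothetical sketch of failure-aware fusion: an inertial-centric
# estimator applies corrections only from healthy subsystems and
# automatically bypasses failed ones. Names are illustrative only.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Subsystem:
    name: str
    healthy: bool = True

    def correction(self) -> float:
        # Placeholder: a real system would return a full state-error
        # estimate constraining the IMU accelerometer/gyroscope biases.
        return 0.1

@dataclass
class InertialCentricEstimator:
    subsystems: List[Subsystem] = field(default_factory=list)
    bias: float = 0.0

    def update(self) -> List[str]:
        """Apply corrections from healthy subsystems only; failed
        modules are skipped (automatic reconfiguration)."""
        used = []
        for s in self.subsystems:
            if s.healthy:
                self.bias += s.correction()
                used.append(s.name)
        return used

est = InertialCentricEstimator([
    Subsystem("lidar"),
    Subsystem("visual", healthy=False),  # simulated visual failure
    Subsystem("gnss"),
])
active = est.update()
print(active)  # the failed "visual" subsystem is bypassed
```

The key design point, in both the sketch and the paper's description, is that the primary inertial pipeline keeps running regardless of which exteroceptive subsystems are available, so a single sensor outage degrades rather than breaks the estimate.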


