Robust tightly-coupled pose estimation based on monocular vision, inertia and wheel speed

03/03/2020
by Peng Gang, et al.

Visual SLAM is widely used for self-localization and mapping in complex environments. Visual-inertial SLAM, which fuses a camera with an IMU, significantly improves robustness and makes scale weakly observable, whereas monocular visual SLAM alone cannot observe scale at all. For ground mobile robots, introducing a wheel speed sensor resolves the weak scale observability and improves robustness under abnormal conditions. This thesis proposes a multi-sensor fusion SLAM algorithm using monocular vision, inertial, and wheel speed measurements. The sensor measurements are combined in a tightly-coupled manner, and a nonlinear optimization method maximizes the posterior probability to solve for the optimal state estimate. Loop detection and back-end optimization are added to reduce or even eliminate the accumulated pose error, ensuring global consistency of the trajectory and map. The main contributions are: a wheel odometer pre-integration algorithm that combines chassis speed and IMU angular velocity, avoiding repeated integration caused by linearization-point changes during iterative optimization; and a state initialization based on the wheel odometer and IMU that quickly and reliably computes the initial state values required by the state estimator in both stationary and moving conditions. Comparative experiments were carried out in room-scale scenes, building-scale scenes, and under visual loss. The results show that the proposed algorithm achieves high accuracy, with 2.2 m of accumulated error after 812 m of motion (0.28%, with loop closure optimization disabled), and strong robustness, localizing effectively even under sensor failures such as visual loss. Both the accuracy and the robustness of the proposed method are superior to monocular visual-inertial SLAM and a traditional wheel odometer.
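To illustrate the idea behind wheel odometer pre-integration, the sketch below accumulates a relative 2D pose increment from chassis linear speed and IMU yaw rate between two keyframes. This is a minimal, hypothetical illustration only: the function name and interface are invented here, and the paper's actual pre-integration additionally propagates Jacobians with respect to bias and measurement covariances so the increment can be corrected without re-integration when the linearization point changes.

```python
import math

def preintegrate_wheel_odometry(samples, dt):
    """Accumulate a relative planar pose increment (dx, dy, dtheta)
    in the frame of the first keyframe, from pairs of
    (wheel linear speed v [m/s], IMU yaw rate omega [rad/s])
    sampled at a fixed period dt [s].

    The result depends only on the raw measurements, not on the
    global pose, which is the key property of pre-integration.
    """
    dx = dy = dtheta = 0.0
    for v, omega in samples:
        # Integrate translation using the yaw accumulated so far
        # (simple Euler integration of the unicycle model).
        dx += v * math.cos(dtheta) * dt
        dy += v * math.sin(dtheta) * dt
        # Yaw comes from the gyroscope, not from wheel differentials.
        dtheta += omega * dt
    return dx, dy, dtheta

# Example: 1 s of straight driving at 1 m/s gives roughly (1, 0, 0).
print(preintegrate_wheel_odometry([(1.0, 0.0)] * 10, 0.1))
```

Because the increment is expressed relative to the previous keyframe, the optimizer can re-linearize the keyframe states freely during iteration without re-running this integration, which is the repeated-integration cost the abstract refers to.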


