A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors

01/11/2019
by Tong Qin, et al.

Nowadays, more and more sensors are mounted on robots to increase robustness and autonomy. We have seen various sensor suites on different platforms, such as stereo cameras on ground vehicles, a monocular camera with an IMU (Inertial Measurement Unit) on mobile phones, and stereo cameras with an IMU on aerial robots. Although many state-estimation algorithms have been proposed in the past, they are usually designed for a single sensor or a specific sensor suite; few can be employed with multiple sensor choices. In this paper, we propose a general optimization-based framework for odometry estimation that supports multiple sensor sets. Every sensor is treated as a general factor in our framework, and factors that share common state variables are summed together to build the optimization problem. We further demonstrate this generality with visual and inertial sensors, which form three sensor suites (stereo cameras, a monocular camera with an IMU, and stereo cameras with an IMU). We validate the performance of our system on public datasets and in real-world experiments with multiple sensors, and compare the results against other state-of-the-art algorithms. We highlight that our system is a general framework that can easily fuse various sensors in a pose graph optimization. Our implementation is open source: https://github.com/HKUST-Aerial-Robotics/VINS-Fusion.
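The factor-summation idea in the abstract can be sketched as a small nonlinear least-squares problem: each sensor contributes residual terms (factors) over shared pose variables, and all terms are stacked and minimized jointly. Below is a minimal 1-D sketch using SciPy; the variable names, measurements, and sensor labels are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy 1-D pose graph with three poses x0, x1, x2 along a line.
# Each "sensor" contributes residual factors over shared state variables;
# stacking all factors yields one least-squares problem (hypothetical data).

prior = 0.0        # absolute prior factor on x0 (fixes the origin)
odom = [1.0, 1.0]  # relative measurements x1 - x0 and x2 - x1 from sensor A
loop = 2.1         # relative measurement x2 - x0 from sensor B

def residuals(x):
    x0, x1, x2 = x
    return np.array([
        x0 - prior,           # prior factor on pose 0
        (x1 - x0) - odom[0],  # odometry factor between poses 0 and 1
        (x2 - x1) - odom[1],  # odometry factor between poses 1 and 2
        (x2 - x0) - loop,     # loop-closure factor between poses 0 and 2
    ])

sol = least_squares(residuals, x0=np.zeros(3))
# The optimizer spreads the 0.1 inconsistency across the factors:
print(sol.x)  # approximately [0.0, 1.033, 2.067]
```

In the same spirit, adding a new sensor to the framework only means appending its residual terms to the stacked vector; the shared state variables tie all factors together.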

Related research:

- A General Optimization-based Framework for Global Pose Estimation with Multiple Sensors (01/11/2019)
- UMS-VINS: United Monocular-Stereo Features for Visual-Inertial Tightly Coupled Odometry (03/15/2023)
- Fast, Accurate Thin-Structure Obstacle Detection for Autonomous Mobile Robots (08/14/2017)
- NTU VIRAL: A Visual-Inertial-Ranging-Lidar Dataset, From an Aerial Vehicle Viewpoint (02/01/2022)
- VersaVIS: An Open Versatile Multi-Camera Visual-Inertial Sensor Suite (12/05/2019)
- USTC FLICAR: A Multisensor Fusion Dataset of LiDAR-Inertial-Camera for Heavy-duty Autonomous Aerial Work Robots (04/04/2023)
- Lvio-Fusion: A Self-adaptive Multi-sensor Fusion SLAM Framework Using Actor-critic Method (06/12/2021)
