VIRAL-Fusion: A Visual-Inertial-Ranging-Lidar Sensor Fusion Approach

10/23/2020
by Thien-Minh Nguyen, et al.

In recent years, Onboard Self-Localization (OSL) methods based on cameras or Lidar have made significant progress. However, issues such as estimation drift and feature dependence remain inherent limitations. Infrastructure-based methods, on the other hand, can generally overcome these issues, but at the expense of installation cost. This raises the question of how to effectively combine the two approaches, so as to achieve localization with long-term consistency and greater flexibility than any single method offers. To this end, we propose a comprehensive optimization-based estimator for the 15-dimensional state of an Unmanned Aerial Vehicle (UAV), fusing data from an extensive set of sensors: inertial measurement units (IMUs), Ultra-Wideband (UWB) ranging sensors, and multiple onboard Visual-Inertial and Lidar odometry subsystems. In essence, a sliding window is used to formulate a sequence of robot poses, where relative rotational and translational constraints between these poses are captured by IMU preintegration and OSL observations, while orientation and position are coupled through body-offset UWB range observations. An optimization-based approach is developed to estimate the trajectory of the robot over this sliding window. We evaluate the performance of the proposed scheme in multiple scenarios, including experiments on public datasets, a high-fidelity graphical-physical simulator, and field-collected data from UAV flight tests. The results demonstrate that our integrated localization method can effectively resolve the drift issue while incurring minimal installation requirements.
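
To make the sliding-window formulation concrete, the sketch below illustrates the two residual types described above: a relative-pose constraint between consecutive window poses (as produced by IMU preintegration or an OSL odometry subsystem) and a body-offset UWB range constraint that couples orientation and position. This is a minimal illustration, not the authors' implementation; the function names, the scalar-last quaternion convention, and the use of SciPy are assumptions made for clarity.

```python
# Minimal sketch (assumed, not the paper's code) of the residuals in a
# sliding-window pose-graph cost: relative-pose factors and body-offset
# UWB range factors.
import numpy as np
from scipy.spatial.transform import Rotation as R

def relative_pose_residual(p_i, q_i, p_j, q_j, dp_meas, dq_meas):
    """Residual between predicted and measured relative motion i -> j.

    p_*: 3D positions, q_*: quaternions (x, y, z, w),
    dp_meas / dq_meas: measured relative translation / rotation from
    IMU preintegration or an OSL odometry subsystem.
    """
    R_i = R.from_quat(q_i)
    dp_pred = R_i.inv().apply(p_j - p_i)        # relative translation in frame i
    dq_pred = R_i.inv() * R.from_quat(q_j)      # relative rotation i -> j
    r_p = dp_pred - dp_meas
    r_q = (R.from_quat(dq_meas).inv() * dq_pred).as_rotvec()  # rotation error
    return np.concatenate([r_p, r_q])

def uwb_range_residual(p_k, q_k, anchor_pos, antenna_offset, d_meas):
    """Body-offset range residual for one UWB measurement.

    The ranging antenna sits at p_k + R_k * antenna_offset, so the residual
    depends on both the position and the orientation of the body frame.
    """
    antenna_world = p_k + R.from_quat(q_k).apply(antenna_offset)
    return np.linalg.norm(antenna_world - anchor_pos) - d_meas
```

In a sliding-window estimator, residuals of both kinds would be stacked over all poses in the window and minimized jointly, e.g. with a nonlinear least-squares solver; the body-offset term is what allows the range measurements to constrain orientation as well as position, since the antenna lever arm is rotated into the world frame.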
