Visual-Inertial Mapping with Non-Linear Factor Recovery

04/13/2019
by Vladyslav Usenko, et al.

Cameras and inertial measurement units are complementary sensors for ego-motion estimation and environment mapping. Their combination makes visual-inertial odometry (VIO) systems more accurate and robust. For globally consistent mapping, however, combining visual and inertial information is not straightforward. Estimating motion and geometry from a set of images requires large baselines, so most systems operate on keyframes separated by large time intervals. Inertial data, on the other hand, degrades quickly with the length of these intervals; after several seconds of integration it typically contains little useful information. In this paper, we propose to extract the relevant information for visual-inertial mapping from visual-inertial odometry using non-linear factor recovery. We reconstruct a set of non-linear factors that optimally approximate the information on the trajectory accumulated by VIO. To obtain a globally consistent map, we combine these factors with loop-closing constraints in bundle adjustment. The VIO factors make the roll and pitch angles of the global map observable and improve the robustness and accuracy of the mapping. In experiments on a public benchmark, we demonstrate superior performance of our method over state-of-the-art approaches.
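A rough sketch of the two objectives the abstract describes, in our own notation (the symbols below are illustrative and not taken from the paper): non-linear factor recovery is commonly posed as a KL-divergence minimization between the Gaussian approximation of the VIO marginal over keyframe states and the distribution induced by a small set of recovered factors, and the global map is then obtained by bundle adjustment over those factors together with loop-closure reprojection terms.

% Sketch of the recovery objective (assumed formulation): given the mean mu and
% information matrix H of the keyframe states produced by VIO marginalization,
% choose factor information matrices Omega_i so that the combined linearized
% information Lambda best matches H in the KL sense.
\[
  \{\Omega_i^{\star}\} \;=\; \arg\min_{\Omega_i \succeq 0}\;
  D_{\mathrm{KL}}\!\Bigl(\mathcal{N}(\mu, H^{-1}) \,\Big\|\, \mathcal{N}(\mu, \Lambda^{-1})\Bigr),
  \qquad
  \Lambda \;=\; \sum_i J_i^{\top} \Omega_i J_i ,
\]
% where J_i is the Jacobian of the i-th recovered factor r_i evaluated at mu.
% The global map then follows from bundle adjustment over keyframe poses T_k and
% landmarks l_j, combining reprojection residuals e_kj (including loop closures,
% with image observations z_kj, projection pi and covariance Sigma_kj) with the
% recovered VIO factors:
\[
  \min_{T,\,l}\;
  \sum_{k,j} e_{kj}^{\top} \Sigma_{kj}^{-1} e_{kj}
  \;+\;
  \sum_i r_i(T)^{\top} \Omega_i\, r_i(T),
  \qquad
  e_{kj} = \pi(T_k, l_j) - z_{kj} .
\]

The second sum is what distinguishes this from a vision-only pose graph: it carries the inertial information (notably gravity direction, hence roll and pitch observability) into the global optimization without re-integrating raw IMU data over long keyframe intervals.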

Related research

Keyframe-Based Visual-Inertial Online SLAM with Relocalization (02/07/2017)
Complementing images with inertial measurements has become one of the mo...

Rolling-Shutter Modelling for Direct Visual-Inertial Odometry (11/04/2019)
We present a direct visual-inertial odometry (VIO) method which estimate...

R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package (09/10/2021)
In this letter, we propose a novel LiDAR-Inertial-Visual sensor fusion f...

LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping (04/22/2021)
We propose a framework for tightly-coupled lidar-visual-inertial odometr...

Deep Inertial Odometry with Accurate IMU Preintegration (01/18/2021)
Inertial Measurement Units (IMUs) are interceptive modalities that provi...

Direct Sparse Visual-Inertial Odometry using Dynamic Marginalization (04/16/2018)
We present VI-DSO, a novel approach for visual-inertial odometry, which ...

xVIO: A Range-Visual-Inertial Odometry Framework (10/13/2020)
xVIO is a range-visual-inertial odometry algorithm implemented at JPL. I...
