Observability Analysis of Aided INS with Heterogeneous Features of Points, Lines and Planes

05/12/2018
by   Yulin Yang, et al.

In this paper, we perform a thorough observability analysis for linearized inertial navigation systems (INS) aided by exteroceptive range and/or bearing sensors (such as cameras, LiDAR and sonar) with different geometric features (points, lines and planes). While the observability of vision-aided INS (VINS) with point features has been extensively studied in the literature, we analytically show that the general aided INS with point features preserves the same observability property: that is, 4 unobservable directions, corresponding to the global yaw and the global position of the sensor platform. We further prove that there are at least 5 (and 7) unobservable directions for the linearized aided INS with a single line (and plane) feature; and, for the first time, analytically derive the unobservable subspace for the case of multiple lines/planes. Building upon this, we examine the observability of the linearized aided INS with different combinations of points, lines and planes, and show that, in general, the system preserves at least 4 unobservable directions, while if global measurements are available, as expected, some unobservable directions diminish. In particular, when using plane features, we propose a minimal, closest point (CP) representation, and we study in depth how 5 identified degenerate motions affect observability. To numerically validate our analysis, we develop and evaluate both EKF-based visual-inertial SLAM and visual-inertial odometry (VIO) using heterogeneous geometric features in Monte Carlo simulations.
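As a minimal illustration (not the paper's implementation), the sketch below shows the two ideas named in the abstract: the closest-point (CP) parameterization of a plane, and a generic numerical check of unobservable directions via the null space of a stacked observability matrix. It assumes the standard Hesse normal form (unit normal n, distance d) for planes; the function names and the toy linear system are hypothetical.

```python
# Minimal sketch, assuming the Hesse normal form (n, d) for planes and a
# generic linearized system; not the paper's exact notation or code.
import numpy as np
from scipy.linalg import null_space

def plane_to_cp(n, d):
    """Hesse normal form (unit normal n, distance d > 0) -> closest point Pi = d * n."""
    n = np.asarray(n, dtype=float)
    n = n / np.linalg.norm(n)
    return d * n  # minimal 3-parameter plane representation

def cp_to_plane(cp):
    """Closest point Pi -> (unit normal, distance)."""
    cp = np.asarray(cp, dtype=float)
    d = np.linalg.norm(cp)
    return cp / d, d

def unobservable_directions(Phis, Hs):
    """Stack H_k * Phi_k into an observability matrix and return its null space.
    Phis[k]: state-transition matrix from the initial time to step k.
    Hs[k]:   measurement Jacobian at step k.
    Columns of the returned matrix span the unobservable subspace."""
    M = np.vstack([H @ Phi for H, Phi in zip(Hs, Phis)])
    return null_space(M)

# Toy usage: a plane 2 m away along +z, and a trivial 2-step system
# that only measures the first state component.
cp = plane_to_cp([0.0, 0.0, 1.0], 2.0)            # -> array([0., 0., 2.])
n, d = cp_to_plane(cp)                            # recovers the Hesse form
N = unobservable_directions([np.eye(3)] * 2,      # identity state transitions
                            [np.array([[1.0, 0.0, 0.0]])] * 2)
print(N.shape[1])                                 # 2 unobservable directions
```

In the toy example, the observability matrix only constrains the first state component, so the null space has 2 columns; the paper's analysis performs the analogous rank/null-space reasoning analytically for the full aided-INS state with point, line and plane features.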


