Visual-Thermal Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments

03/05/2019
by   Shehryar Khattak, et al.

With aerial robots increasingly deployed for mission-critical tasks such as disaster response operations, search and rescue missions, and infrastructure inspections in GPS-denied environments, the need for reliable autonomous operation of aerial robots has become crucial. When operating in GPS-denied areas, aerial robots rely on a multitude of sensors to localize and navigate. Visible-spectrum cameras are the most commonly used sensors due to their low cost and weight. However, in visually degraded environments, such as under poor illumination, low texture, or the presence of obscurants including fog, smoke, and dust, the reliability of visible-light cameras deteriorates significantly. Nevertheless, maintaining reliable robot navigation in such conditions is essential. In contrast to visible-light cameras, thermal cameras offer visibility in the infrared spectrum and can be used in a complementary manner with visible-spectrum cameras for robot localization and navigation, without the significant weight and power penalty typically associated with carrying additional sensors. Exploiting this fact, in this work we present a multi-sensor fusion algorithm for reliable odometry estimation in GPS-denied and degraded visual environments. The proposed method utilizes information from both the visible and thermal spectra for landmark selection and prioritizes feature extraction from informative image regions based on a metric over spatial entropy. Furthermore, inertial sensing cues are integrated to improve the robustness of the odometry estimation process. To verify our solution, we conducted a set of challenging experiments inside a) an obscurant-filled, machine-shop-like industrial environment, as well as b) a dark subterranean mine in the presence of heavy airborne dust.
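The entropy-based prioritization of image regions mentioned in the abstract can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the grid size, histogram bin count, and `top_k` parameters are assumptions chosen for the example, and the ranking uses plain Shannon entropy of per-cell intensity histograms as a stand-in for the paper's spatial-entropy metric.

```python
import numpy as np

def patch_entropy(patch, bins=32):
    # Shannon entropy (bits) of the intensity histogram of one patch.
    hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def select_informative_regions(image, grid=(4, 4), top_k=4):
    # Split a grayscale image into grid cells, score each cell by
    # entropy, and return the top_k cell coordinates (row, col).
    h, w = image.shape
    gh, gw = h // grid[0], w // grid[1]
    scores = {}
    for r in range(grid[0]):
        for c in range(grid[1]):
            cell = image[r * gh:(r + 1) * gh, c * gw:(c + 1) * gw]
            scores[(r, c)] = patch_entropy(cell)
    return sorted(scores, key=scores.get, reverse=True)[:top_k]

# Synthetic check: a textured (noisy) corner should outrank flat regions,
# so a feature extractor would spend its budget there first.
rng = np.random.default_rng(0)
img = np.full((128, 128), 100, dtype=np.uint8)   # low-texture background
img[:32, :32] = rng.integers(0, 256, (32, 32))   # high-entropy corner
print(select_informative_regions(img))            # (0, 0) ranked first
```

In a visual-thermal setup, the same scoring would be run on both the visible and thermal frames so that feature extraction falls back to whichever spectrum still contains informative texture.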


Related research

- Marker based Thermal-Inertial Localization for Aerial Robots in Obscurant Filled Environments (03/02/2019): For robotic inspection tasks in known environments fiducial markers prov...
- Keyframe-based Direct Thermal-Inertial Odometry (03/03/2019): This paper proposes an approach for fusing direct radiometric data from ...
- xVIO: A Range-Visual-Inertial Odometry Framework (10/13/2020): xVIO is a range-visual-inertial odometry algorithm implemented at JPL. I...
- Improving Visual Feature Extraction in Glacial Environments (08/27/2019): Glacial science could benefit tremendously from autonomous robots, but p...
- Vision-Depth Landmarks and Inertial Fusion for Navigation in Degraded Visual Environments (03/05/2019): This paper proposes a method for tight fusion of visual, depth and inert...
- Active Collaborative Localization in Heterogeneous Robot Teams (05/29/2023): Accurate and robust state estimation is critical for autonomous navigati...
