Perception-aware Autonomous Mast Motion Planning for Planetary Exploration Rovers

12/14/2019
by Jared Strader, et al.

Highly accurate real-time localization is of fundamental importance for the safety and efficiency of planetary rovers exploring the surface of Mars. Mars rover operations rely on vision-based systems both to avoid hazards and to plan safe routes. However, vision-based systems assume that sufficient visual texture is visible in the scene, which poses a challenge for navigation on Mars, where texture-poor terrain is prevalent. To overcome this, we exploit the rover's ability to actively steer its visual sensor to improve fault tolerance and maximize perception performance. This paper answers the question of where and when to look by presenting a method for predicting the sensor trajectory that maximizes the localization performance of the rover. This is accomplished through an online assessment of candidate trajectories using synthetic future camera views created from previous observations of the scene. The candidate trajectories are quantified and selected based on the expected localization performance. We validate the proposed method in field experiments at the Jet Propulsion Laboratory (JPL) Mars Yard. Furthermore, multiple performance metrics are identified and evaluated for reducing the overall runtime of the algorithm. We show that actively steering the perception system increases localization accuracy compared to traditional fixed-sensor configurations.
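The core idea, selecting the mast orientation whose predicted view is expected to localize best, can be sketched in a few lines. This is only a loose illustration, not the paper's actual metric or implementation: the gradient-variance `texture_score` is a hypothetical stand-in for the localization-performance metrics the paper evaluates, and all function names here are assumptions.

```python
import numpy as np

def texture_score(image):
    """Hypothetical proxy for expected localization quality:
    variance of image gradients (more texture tends to mean
    more trackable features for visual odometry)."""
    gy, gx = np.gradient(image.astype(float))
    return float(np.var(gx) + np.var(gy))

def select_best_view(candidate_views):
    """Given synthetic future views (one per candidate mast
    orientation), return the index of the view with the highest
    predicted score, along with all scores."""
    scores = [texture_score(view) for view in candidate_views]
    return int(np.argmax(scores)), scores
```

In this sketch, each candidate mast trajectory would contribute one synthetic view rendered from prior observations; a texture-poor (e.g., featureless sand) view scores low and a feature-rich view scores high, so the steering command follows the highest-scoring candidate.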


