Perception-aware Autonomous Mast Motion Planning for Planetary Exploration Rovers
Highly accurate real-time localization is of fundamental importance for the safety and efficiency of planetary rovers exploring the surface of Mars. Mars rover operations rely on vision-based systems to avoid hazards as well as to plan safe routes. However, vision-based systems operate on the assumption that sufficient visual texture is visible in the scene. This poses a challenge for vision-based navigation on Mars, where regions lacking visual texture are prevalent. To overcome this, we exploit the rover's ability to actively steer the visual sensor, improving fault tolerance and maximizing perception performance. This paper answers the question of where and when to look by presenting a method for predicting the sensor trajectory that maximizes the localization performance of the rover. This is accomplished by an online assessment of candidate trajectories using synthetic, future camera views created from previous observations of the scene. The candidate trajectories are scored and selected based on their expected localization performance. We validate the proposed method in field experiments at the Jet Propulsion Laboratory (JPL) Mars Yard. Furthermore, multiple performance metrics are identified and evaluated for reducing the overall runtime of the algorithm. We show how actively steering the perception system increases localization accuracy compared to traditional fixed-sensor configurations.
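The abstract describes selecting a mast (pan/tilt) trajectory by assessing candidates online and scoring them by expected localization performance. The sketch below is an illustrative approximation of that loop, not the authors' implementation: the helper names, the grid of candidate angles, and the texture-map proxy standing in for rendering synthetic views and predicting localization error are all assumptions made for illustration.

```python
# Illustrative sketch (not the paper's code): pick the candidate mast
# trajectory whose viewpoints are expected to give the best localization.
# The "texture map" here is a hypothetical proxy, built from previous
# observations, for rendering synthetic future views and evaluating them.
import numpy as np


def expected_localization_score(texture_map, pan, tilt, fov=30.0):
    """Proxy score: amount of visual texture inside the (pan, tilt) footprint.

    texture_map is assumed to be a 360 x 90 panorama (degrees of pan x tilt);
    higher values mean more trackable texture, hence better expected localization.
    """
    pan_lo = int(pan - fov / 2) % 360
    pan_hi = int(pan + fov / 2) % 360
    tilt_lo = max(0, int(tilt - fov / 2))
    tilt_hi = min(texture_map.shape[1], int(tilt + fov / 2))
    if pan_lo < pan_hi:
        patch = texture_map[pan_lo:pan_hi, tilt_lo:tilt_hi]
    else:  # footprint wraps around 0 degrees of pan
        patch = np.vstack([texture_map[pan_lo:, tilt_lo:tilt_hi],
                           texture_map[:pan_hi, tilt_lo:tilt_hi]])
    return float(patch.sum())


def score_trajectory(texture_map, trajectory):
    """Accumulate the proxy score over the synthetic views along a trajectory."""
    return sum(expected_localization_score(texture_map, pan, tilt)
               for pan, tilt in trajectory)


def plan_mast_trajectory(texture_map, candidates):
    """Return the candidate pan/tilt trajectory with the best expected score."""
    return max(candidates, key=lambda traj: score_trajectory(texture_map, traj))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    demo_map = rng.random((360, 90))  # stand-in for a real texture/feature map
    # Three hypothetical candidate trajectories, each a sequence of (pan, tilt).
    candidates = [
        [(0, 20), (10, 20), (20, 20)],    # sweep right at low tilt
        [(0, 20), (350, 20), (340, 20)],  # sweep left at low tilt
        [(0, 40), (0, 50), (0, 60)],      # hold pan, raise tilt
    ]
    print(plan_mast_trajectory(demo_map, candidates))
```

In the paper's setting, the proxy score would be replaced by rendering a synthetic future view for each pose and quantifying the localization performance expected from it; the selection step, choosing the highest-scoring candidate, stays the same.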