Multi-Modal Active Perception for Information Gathering in Science Missions
Robotic science missions in remote environments, such as the deep ocean and outer space, can involve studying phenomena that cannot be directly observed using on-board sensors but must instead be deduced by combining measurements of correlated variables with domain knowledge. Traditionally, in such missions, robots passively gather data along prescribed paths, while inference, path planning, and other high-level decision making are largely performed by a supervisory science team. However, communication constraints hinder these processes and thus limit the rate of scientific progress. This paper presents an active perception approach that aims to reduce robots' reliance on human supervision and improve science productivity by encoding scientists' domain knowledge and decision-making process on board. We use Bayesian networks to compactly model critical aspects of scientific knowledge while remaining robust to observation and modeling uncertainty. We then formulate path planning and sensor scheduling as an information gain maximization problem, and propose a sampling-based solution based on Monte Carlo tree search to plan informative sensing actions that exploit the knowledge encoded in the network. The computational complexity of our framework does not grow with the number of observations taken, and the framework supports long-horizon planning in an anytime manner, making it highly applicable to field robotics. Simulation results show statistically significant performance improvements over baseline methods, and we validate the practicality of our approach through both hardware experiments and simulated experiments with field data gathered during the NASA Mojave Volatiles Prospector science expedition.
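The sketch below is a minimal, illustrative stand-in for the expected-information-gain criterion the abstract describes: a single binary latent science variable, a handful of candidate sensing actions with symmetric binary sensor models, and a myopic ranking of actions by expected entropy reduction. The variable names, sensor accuracies, and action labels are assumptions made up for illustration; the paper's actual framework uses full Bayesian networks over many correlated variables and Monte Carlo tree search for non-myopic, long-horizon planning.

```python
import math

# Hypothetical two-state latent science variable (e.g., "target phenomenon present").
# The prior and all numbers below are illustrative assumptions, not values from the paper.
PRIOR = {True: 0.5, False: 0.5}

# Candidate sensing actions, each with a symmetric binary sensor accuracy:
# P(detect | present) = P(no detect | absent) = accuracy.
SENSOR_MODELS = {
    "spectrometer_site_A": 0.9,
    "camera_site_B": 0.7,
    "sonar_site_C": 0.6,
}


def entropy(dist):
    """Shannon entropy (in bits) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)


def likelihood(accuracy, detected, state):
    """P(observation | latent state) for a symmetric binary sensor."""
    return accuracy if detected == state else 1.0 - accuracy


def posterior(prior, accuracy, detected):
    """Bayes update of the latent state given one binary observation."""
    unnorm = {s: likelihood(accuracy, detected, s) * prior[s] for s in prior}
    z = sum(unnorm.values())
    return {s: v / z for s, v in unnorm.items()}


def expected_information_gain(prior, accuracy):
    """Expected entropy reduction from taking a single observation with this sensor."""
    h_prior = entropy(prior)
    gain = 0.0
    for detected in (True, False):
        # Marginal probability of this observation outcome under the prior.
        p_obs = sum(likelihood(accuracy, detected, s) * prior[s] for s in prior)
        gain += p_obs * (h_prior - entropy(posterior(prior, accuracy, detected)))
    return gain


if __name__ == "__main__":
    # Rank candidate sensing actions by expected information gain (myopic version
    # of the planning objective; the paper plans over sequences with MCTS).
    ranked = sorted(
        SENSOR_MODELS.items(),
        key=lambda item: expected_information_gain(PRIOR, item[1]),
        reverse=True,
    )
    for action, acc in ranked:
        print(f"{action}: expected gain = "
              f"{expected_information_gain(PRIOR, acc):.3f} bits")
```

Running this ranks the more accurate (hypothetical) spectrometer action highest, which mirrors the intuition behind the objective: under uncertainty, the planner prefers sensing actions whose observations are expected to reduce posterior entropy over the latent science variables the most.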