Robotic Testbed for Rendezvous and Optical Navigation: Multi-Source Calibration and Machine Learning Use Cases

08/12/2021
by Tae Ha Park, et al.

This work presents the most recent advances of the Robotic Testbed for Rendezvous and Optical Navigation (TRON) at Stanford University, the first robotic testbed capable of validating machine learning algorithms for spaceborne optical navigation. The TRON facility consists of two KUKA robot arms with six degrees of freedom each and a set of Vicon motion-tracking cameras to reconfigure an arbitrary relative pose between a camera and a target mockup model. The facility also includes multiple Earth albedo light boxes and a sun lamp to recreate high-fidelity spaceborne illumination conditions. After an overview of the facility, this work details the multi-source calibration procedure, which enables estimation of the relative pose between the object and the camera with millimeter-level position and millidegree-level orientation accuracies. Finally, a comparative analysis of synthetic and TRON-simulated imagery is performed using a Convolutional Neural Network (CNN) pre-trained on the synthetic images. The results show a considerable gap in the CNN's performance, suggesting that TRON-simulated images can be used to validate the robustness of machine learning algorithms trained on more easily accessible synthetic imagery from computer graphics.
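
As a rough illustration of the kind of computation such a calibration enables, the sketch below composes world-frame poses of a camera and a target (as a motion-capture system might report them) into a camera-to-target relative pose and scores an estimate against it. This is a minimal sketch, not the paper's calibration pipeline; the frame conventions, function names, and numerical values are illustrative assumptions.

```python
# Minimal sketch: compose world-frame poses into a camera-to-target relative pose
# and compute translation/rotation errors of a pose estimate. Illustrative only.
import numpy as np
from scipy.spatial.transform import Rotation as R

def relative_pose(R_wc, t_wc, R_wt, t_wt):
    """Camera-to-target pose from world-frame poses of camera (c) and target (t).

    R_wc, t_wc : rotation/translation mapping camera-frame points to the world frame
    R_wt, t_wt : rotation/translation mapping target-frame points to the world frame
    Returns (R_ct, t_ct) such that x_c = R_ct @ x_t + t_ct.
    """
    R_ct = R_wc.T @ R_wt
    t_ct = R_wc.T @ (t_wt - t_wc)
    return R_ct, t_ct

def pose_errors(R_est, t_est, R_true, t_true):
    """Translation error (same units as t) and rotation error in degrees."""
    e_t = np.linalg.norm(t_est - t_true)
    cos_angle = np.clip((np.trace(R_true.T @ R_est) - 1.0) / 2.0, -1.0, 1.0)
    e_R = np.degrees(np.arccos(cos_angle))
    return e_t, e_R

if __name__ == "__main__":
    # Hypothetical world-frame poses of the camera and the target mockup.
    R_wc = R.from_euler("xyz", [5, -10, 30], degrees=True).as_matrix()
    t_wc = np.array([0.0, 0.0, 1.2])
    R_wt = R.from_euler("xyz", [0, 45, 90], degrees=True).as_matrix()
    t_wt = np.array([3.0, 0.5, 1.0])

    R_ct, t_ct = relative_pose(R_wc, t_wc, R_wt, t_wt)
    print("relative translation [m]:", t_ct)
    # Comparing the ground truth against itself yields (0, 0) by construction.
    print("translation / rotation error:", pose_errors(R_ct, t_ct, R_ct, t_ct))
```

The same error metrics could be applied to a CNN's pose predictions on synthetic versus laboratory imagery to quantify the domain gap described above.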


research
10/06/2021

SPEED+: Next Generation Dataset for Spacecraft Pose Estimation across Domain Gap

Autonomous vision-based spaceborne navigation is an enabling technology ...
research
06/08/2022

Adaptive Neural Network-based Unscented Kalman Filter for Spacecraft Pose Tracking at Rendezvous

This paper presents a neural network-based Unscented Kalman Filter (UKF)...
research
06/24/2019

Pose Estimation for Non-Cooperative Rendezvous Using Neural Networks

This work introduces the Spacecraft Pose Network (SPN) for on-board esti...
research
01/19/2021

COTORRA: COntext-aware Testbed fOR Robotic Applications

Edge and fog computing have received considerable attention as promising...
research
07/15/2020

Lunar Terrain Relative Navigation Using a Convolutional Neural Network for Visual Crater Detection

Terrain relative navigation can improve the precision of a spacecraft's ...
research
06/30/2011

Vision-Based Navigation III: Pose and Motion from Omnidirectional Optical Flow and a Digital Terrain Map

An algorithm for pose and motion estimation using corresponding features...
research
02/07/2023

Pole Estimation and Optical Navigation using Circle of Latitude Projections

Images of both rotating celestial bodies (e.g., asteroids) and spheroida...
