Learning to Explore Informative Trajectories and Samples for Embodied Perception

03/20/2023
by Ya Jing, et al.

We are witnessing significant progress in perception models, particularly those trained on large-scale internet images. However, efficiently generalizing these perception models to unseen embodied tasks is insufficiently studied, even though it would benefit many relevant applications (e.g., home robots). Unlike static perception methods trained on pre-collected images, an embodied agent can move around in the environment and observe objects from arbitrary viewpoints. Efficiently learning an exploration policy and a sample-collection method that gather informative training samples is therefore the key to this task. To this end, we first build a 3D semantic distribution map and use it to train the exploration policy in a self-supervised manner by introducing semantic distribution disagreement and semantic distribution uncertainty rewards. Note that the map is aggregated from multi-view observations and can weaken the impact of misidentification from an unfamiliar viewpoint. The agent is thus encouraged to explore objects whose semantic distributions disagree across viewpoints or are uncertain. Along the explored informative trajectories, we then select hard samples based on semantic distribution uncertainty, reducing unnecessary observations that the model already identifies correctly. Experiments show that the perception model fine-tuned with our method outperforms baselines trained with other exploration policies. Furthermore, we demonstrate the robustness of our method in real-robot experiments.
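A minimal sketch (not the authors' released code) of how the two self-supervised rewards and the hard-sample selection described in the abstract could be computed, assuming each map cell stores per-viewpoint class-probability vectors from the perception model. The specific choices here are assumptions for illustration: disagreement is taken as the mean KL divergence of each view's distribution from the cell's aggregated (mean) distribution, and uncertainty as the entropy of that aggregated distribution.

```python
import numpy as np

EPS = 1e-8


def entropy(p: np.ndarray) -> float:
    """Shannon entropy of a probability vector."""
    p = np.clip(p, EPS, 1.0)
    return float(-np.sum(p * np.log(p)))


def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
    """KL(p || q) between two probability vectors."""
    p = np.clip(p, EPS, 1.0)
    q = np.clip(q, EPS, 1.0)
    return float(np.sum(p * np.log(p / q)))


def disagreement_reward(view_dists: np.ndarray) -> float:
    """Semantic distribution disagreement for one map cell (assumed form):
    mean KL divergence of each viewpoint's distribution from the cell's
    aggregated distribution. view_dists: (num_views, num_classes)."""
    mean_dist = view_dists.mean(axis=0)
    return float(np.mean([kl_divergence(d, mean_dist) for d in view_dists]))


def uncertainty_reward(view_dists: np.ndarray) -> float:
    """Semantic distribution uncertainty for one map cell (assumed form):
    entropy of the aggregated distribution."""
    return entropy(view_dists.mean(axis=0))


def select_hard_samples(frame_dists: list, top_k: int) -> list:
    """Keep the top_k frames with the most uncertain predictions
    (highest mean per-pixel entropy), dropping observations the model
    already identifies confidently. frame_dists: list of arrays of
    shape (num_pixels, num_classes)."""
    scores = [np.mean([entropy(p) for p in dist]) for dist in frame_dists]
    return list(np.argsort(scores)[::-1][:top_k])


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three viewpoints observing one map cell over 5 semantic classes.
    views = rng.dirichlet(np.ones(5), size=3)
    print("disagreement reward:", disagreement_reward(views))
    print("uncertainty reward:", uncertainty_reward(views))
```

In this reading, the exploration policy is rewarded for moving toward cells with high disagreement or high uncertainty, and only the hard frames along the resulting trajectories are kept for fine-tuning the perception model.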

