Transferring Implicit Knowledge of Non-Visual Object Properties Across Heterogeneous Robot Morphologies

09/14/2022
by   Gyan Tatiya, et al.

Humans leverage multiple sensor modalities when interacting with objects and discovering their intrinsic properties. The visual modality alone is insufficient for inferring many object properties (e.g., which of two boxes is heavier), making it essential to also consider non-visual modalities such as touch and audio. While robots can build an understanding of object properties through learned exploratory interactions with objects (e.g., grasping, lifting, and shaking behaviors) that draw on these modalities, a challenge remains: the implicit knowledge acquired by one robot via object exploration cannot be directly leveraged by another robot with a different morphology, because sensor models, observed data distributions, and interaction capabilities differ across robot configurations. To avoid the costly process of learning interactive object perception tasks from scratch on each new robot, we propose a multi-stage projection framework for transferring implicit knowledge of object properties across heterogeneous robot morphologies. We evaluate our approach on object-property recognition and object-identity recognition tasks, using a dataset in which two heterogeneous robots perform 7,600 object interactions. Results indicate that knowledge can be transferred across robots, such that a newly deployed robot can bootstrap its recognition models without exhaustively exploring all objects. We also propose a data augmentation technique and show that it improves model generalization. We release our code and datasets here: https://github.com/gtatiya/Implicit-Knowledge-Transfer.
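The core idea, projecting one robot's learned sensory feature space into another robot's feature space using a small set of shared object interactions, can be illustrated with a minimal sketch. The code below is an assumption for illustration only, not the authors' exact architecture: a simple encoder-decoder trained with PyTorch on paired features from trials that both robots have performed on the same objects. All dimensions, layer sizes, and variable names (ProjectionNet, src_feats, tgt_feats) are hypothetical.

# Minimal sketch (assumed architecture, not taken from the paper): map a source
# robot's non-visual interaction features into a target robot's feature space,
# trained on a small set of objects that both robots have explored.
import torch
import torch.nn as nn

class ProjectionNet(nn.Module):
    """Projects source-robot features to target-robot features via a latent code."""
    def __init__(self, src_dim: int, tgt_dim: int, latent_dim: int = 64):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(src_dim, 128), nn.ReLU(),
                                     nn.Linear(128, latent_dim), nn.ReLU())
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                     nn.Linear(128, tgt_dim))

    def forward(self, src_feats: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(src_feats))

# Toy paired data standing in for features from the same (object, behavior)
# trials recorded by both robots, e.g. haptic/audio descriptors from grasp,
# lift, and shake interactions. Shapes are illustrative.
src_dim, tgt_dim, n_shared_trials = 100, 80, 256
src_feats = torch.randn(n_shared_trials, src_dim)   # source robot's features
tgt_feats = torch.randn(n_shared_trials, tgt_dim)   # target robot's features

model = ProjectionNet(src_dim, tgt_dim)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(src_feats), tgt_feats)  # reconstruction in target space
    loss.backward()
    optimizer.step()

# Features the source robot collected on objects the target robot has never
# touched can then be projected into the target's feature space and used to
# bootstrap the target robot's property/identity classifiers.
with torch.no_grad():
    projected = model(torch.randn(10, src_dim))
print(projected.shape)  # torch.Size([10, 80])

In practice the paired training set would come from the shared objects both robots have interacted with, and the projected features would supplement (or replace) data the newly deployed robot has not yet collected.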


