Cross-Tool and Cross-Behavior Perceptual Knowledge Transfer for Grounded Object Recognition

03/07/2023
by Gyan Tatiya et al.

Humans learn about objects through interaction, using multiple senses such as vision, hearing, and touch. While vision can reveal an object's appearance, non-visual modalities such as audio and haptics can reveal intrinsic properties, such as weight, temperature, and hardness, as well as the sounds the object makes. Using tools to interact with objects can expose additional properties that would otherwise remain hidden; knives and spoons, for example, can be used to examine the texture and consistency of food. Robots can likewise use tools to interact with objects and gather information about such implicit properties through non-visual sensors. However, the model a robot learns for recognizing objects with one tool-mediated behavior does not generalize to a new tool or behavior, because the observed data distributions differ. To address this challenge, we propose a framework that enables robots to transfer implicit knowledge about granular objects across different tools and behaviors. The proposed approach learns a shared latent space from the sensory contexts that the source and target robots produce while interacting with objects using tools. We collected a dataset with a UR5 robot that performed 5,400 interactions on 15 granular objects using 6 tools and 6 behaviors, and evaluated our method on cross-tool and cross-behavior transfer tasks. Our results show that a less experienced target robot can benefit from the experience gained by the source robot and recognize a set of novel objects. We have released the code, datasets, and additional results: https://github.com/gtatiya/Tool-Knowledge-Transfer.
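To make the transfer idea concrete, the sketch below shows one plausible reading of the shared-latent-space approach in PyTorch: two tool-specific encoders are trained so that paired interactions with the same object land near each other in a common latent space, and a classifier trained on the source robot's latents is then reused by the target robot. The dimensions, network shapes, and the simple alignment loss are illustrative assumptions, not the paper's actual architecture; consult the released code at the link above for the real implementation.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: 128-D source-tool features, 96-D target-tool
# features, a 32-D shared latent space, and 15 granular object classes.
SRC_DIM, TGT_DIM, LATENT_DIM, N_CLASSES = 128, 96, 32, 15

class Encoder(nn.Module):
    """Maps a tool-specific sensory feature vector into the shared latent space."""
    def __init__(self, in_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, LATENT_DIM),
        )

    def forward(self, x):
        return self.net(x)

src_enc, tgt_enc = Encoder(SRC_DIM), Encoder(TGT_DIM)
classifier = nn.Linear(LATENT_DIM, N_CLASSES)

params = (list(src_enc.parameters()) + list(tgt_enc.parameters())
          + list(classifier.parameters()))
opt = torch.optim.Adam(params, lr=1e-3)
ce = nn.CrossEntropyLoss()

# Toy stand-ins for paired interactions: both contexts observe the same
# object instances, so rows of x_src and x_tgt share the labels y.
x_src = torch.randn(256, SRC_DIM)
x_tgt = torch.randn(256, TGT_DIM)
y = torch.randint(0, N_CLASSES, (256,))

for _ in range(200):
    z_src, z_tgt = src_enc(x_src), tgt_enc(x_tgt)
    align = ((z_src - z_tgt) ** 2).mean()   # pull paired latents together
    clf = ce(classifier(z_src), y)          # recognize objects from source latents
    loss = align + clf
    opt.zero_grad()
    loss.backward()
    opt.step()

# At test time, the target robot encodes its own interactions and reuses
# the classifier trained on the source robot's experience.
pred = classifier(tgt_enc(x_tgt)).argmax(dim=1)
```

The key design choice this sketch highlights is that only the encoders are tool-specific; the latent space and the classifier are shared, so recognition experience gathered with one tool or behavior can be reused with another.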


