Learning at the Ends: From Hand to Tool Affordances in Humanoid Robots

04/09/2018
by Giovanni Saponaro, et al.

One of the open challenges in designing robots that operate successfully in unpredictable human environments is how to enable them to predict which actions they can perform on objects, and what the effects of those actions will be, i.e., to perceive object affordances. Since modeling all possible world interactions is unfeasible, learning from experience is required, which poses the challenge of collecting a large amount of experiences (i.e., training data). Typically, a manipulative robot operates on external objects using its own hands (or similar end-effectors), but in some cases the use of tools may be desirable. Nevertheless, it is reasonable to assume that, while a robot can collect many sensorimotor experiences using its own hands, it cannot do so for all possible human-made tools. Therefore, in this paper we investigate the developmental transition from hand to tool affordances: which sensorimotor skills acquired by a robot with its bare hands can be employed for tool use? Employing a visual and motor imagination mechanism to represent different hand postures compactly, we propose a probabilistic model to learn hand affordances, and we show how this model can generalize to estimate the affordances of previously unseen tools, ultimately supporting planning, decision-making, and tool-selection tasks in humanoid robots. We present experimental results with the iCub humanoid robot, and we publicly release the collected sensorimotor data in the form of a hand posture affordances dataset.
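The idea of generalizing from hand to tool affordances can be illustrated with a minimal sketch: if hand postures and tool shapes are mapped into a shared, compact shape descriptor, then effect probabilities learned from bare-hand interactions transfer to any tool whose descriptor falls in the same bin. The code below is a hypothetical toy model (counts with Laplace smoothing), not the authors' actual probabilistic model; all names (`AffordanceModel`, `observe`, `predict`, the "flat"/"pointy" shape bins) are illustrative assumptions.

```python
from collections import defaultdict


class AffordanceModel:
    """Toy affordance model: estimates P(effect | action, shape_bin)
    from interaction counts, with Laplace smoothing (illustrative only)."""

    def __init__(self, effects, alpha=1.0):
        self.effects = list(effects)
        self.alpha = alpha  # Laplace smoothing constant
        self.counts = defaultdict(lambda: defaultdict(float))

    def observe(self, action, shape_bin, effect):
        # Record one sensorimotor experience.
        self.counts[(action, shape_bin)][effect] += 1.0

    def predict(self, action, shape_bin):
        # Posterior over effects given action and effector shape descriptor.
        c = self.counts[(action, shape_bin)]
        total = sum(c.values()) + self.alpha * len(self.effects)
        return {e: (c[e] + self.alpha) / total for e in self.effects}


# Train on bare-hand experiences: shape_bin abstracts the hand posture.
model = AffordanceModel(effects=["moved", "unmoved"])
for _ in range(8):
    model.observe("push", "flat", "moved")
model.observe("push", "flat", "unmoved")
for _ in range(9):
    model.observe("push", "pointy", "unmoved")

# A previously unseen tool whose end maps to the "flat" shape bin inherits
# the effect prediction learned from hand experience.
pred = model.predict("push", "flat")
```

After training, `pred["moved"]` dominates: experience gathered with a flat hand posture carries over to a flat-ended tool without any tool-specific data, which is the core intuition behind the hand-to-tool transfer studied in the paper.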


