Adaptive robot body learning and estimation through predictive coding

05/08/2018
by Pablo Lanillos, et al.

The predictive functions that permit humans to infer their body state through sensorimotor integration are critical for safe interaction in complex environments. These functions are adaptive and robust to non-linear actuators and noisy sensory information. This paper presents a scalable computational perceptual model based on predictive processing that enables any multisensory robot to learn, infer and update its body configuration when using arbitrary sensors with additive Gaussian noise. The proposed method integrates different sources of information (tactile, visual and proprioceptive) to drive the robot's belief towards its current body configuration. The motivation is to provide robots with the embodied perception needed for self-calibration and safe physical human-robot interaction. We formulate body learning as obtaining the function that encodes the sensory consequences of the body configuration, together with its partial derivatives with respect to the body variables, and we solve it by Gaussian process regression. We model body estimation as minimizing the discrepancy between the robot's belief about its body configuration and the observed posterior, which we achieve by minimizing the variational free energy using the sensory prediction errors (sensed versus expected values). To evaluate the model, we test it on a real multisensory robotic arm. We show how the contributions of different sensor modalities, included as additive errors, refine the body estimate, and how the system adapts to provide the most plausible solution even when strong visuo-tactile perturbations are injected. We further analyse the reliability of the model when individual sensor modalities are disabled. This provides grounded evidence for the correctness of the perceptual model and shows how the robot estimates and adjusts its body configuration purely from sensory information.
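The abstract describes two coupled components: learning a forward sensory model by Gaussian process regression, and estimating the body configuration by gradient descent on the variational free energy, i.e. on precision-weighted sensory prediction errors. The following is a minimal sketch of that scheme, not the authors' implementation: it assumes a hypothetical 2-DoF arm with a "visual" sensor observing the end-effector and noisy proprioceptive joint readings (a tactile modality would enter the same way, as another additive error term); the noise levels, precisions and learning rate are illustrative choices.

```python
# Sketch: body learning (GP regression of sensory consequences) and body
# estimation (free-energy minimization via prediction errors).
# All models, sensors and parameters here are assumptions for illustration.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def end_effector(q):
    """Ground-truth 2-link forward model, used only to generate data."""
    l1, l2 = 1.0, 0.8
    return np.array([l1 * np.cos(q[0]) + l2 * np.cos(q[0] + q[1]),
                     l1 * np.sin(q[0]) + l2 * np.sin(q[0] + q[1])])

# --- Body learning: regress the visual consequences g_v(q) of the body
# configuration q from noisy samples (Gaussian process regression). ---
Q = rng.uniform(-np.pi / 2, np.pi / 2, size=(200, 2))
Y = np.array([end_effector(q) for q in Q]) + 0.01 * rng.standard_normal((200, 2))
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-4).fit(Q, Y)

def g_visual(mu):
    return gp.predict(mu.reshape(1, -1))[0]

def jac_visual(mu, eps=1e-4):
    """Finite-difference partial derivatives of the learned forward model."""
    J = np.zeros((2, 2))
    for j in range(2):
        d = np.zeros(2); d[j] = eps
        J[:, j] = (g_visual(mu + d) - g_visual(mu - d)) / (2 * eps)
    return J

# --- Body estimation: gradient descent on the free energy, i.e. on the
# precision-weighted prediction errors (sensed vs expected). ---
q_true = np.array([0.4, -0.6])
s_prop = q_true + 0.1 * rng.standard_normal(2)                  # proprioception
s_vis = end_effector(q_true) + 0.05 * rng.standard_normal(2)    # vision
Pi_prop, Pi_vis = 1.0 / 0.1**2, 1.0 / 0.05**2                   # precisions

mu = np.zeros(2)          # belief about the body configuration
lr = 1e-3
for _ in range(500):
    e_prop = s_prop - mu                  # proprioceptive prediction error
    e_vis = s_vis - g_visual(mu)          # visual prediction error
    # -dF/dmu: errors weighted by precision and mapped through dg/dmu
    dmu = Pi_prop * e_prop + jac_visual(mu).T @ (Pi_vis * e_vis)
    mu += lr * dmu

print("true q:", q_true, "estimated mu:", mu)
```

The design point the sketch tries to reflect is that each modality contributes an additive, precision-weighted error term to the belief update, so adding, removing or perturbing a sensor only changes the sum of error terms, not the update rule itself.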
