Body models in humans, animals, and robots

10/19/2020
by Matej Hoffmann, et al.

Humans and animals excel in combining information from multiple sensory modalities, controlling their complex bodies, and adapting to growth, failures, or tool use. These capabilities are also highly desirable in robots, and machines display them to some extent - yet, as is so often the case, the artificial creatures lag behind. The key foundation is an internal representation of the body that the agent - human, animal, or robot - has developed. In the biological realm, evidence accumulated across diverse disciplines has given rise to the concepts of body image, body schema, and others. In robotics, a model of the robot is an indispensable component that enables control of the machine. In this article I compare the character of body representations in biology with their robotic counterparts and relate that to the differences in performance that we observe. I put forth a number of axes regarding the nature of such body models: fixed vs. plastic, amodal vs. modal, explicit vs. implicit, serial vs. parallel, modular vs. holistic, and centralized vs. distributed. An interesting trend emerges: on many of these axes, there is a sequence from robot body models, through the body image and body schema, to the body representation of lower animals like the octopus. In some sense, robots have a lot in common with Ian Waterman - "the man who lost his body" - in that they rely on an explicit, veridical body model (the body image taken to the extreme) and lack any implicit, multimodal representation of their bodies (like the body schema). I will then detail how robots can inform the biological sciences dealing with body representations and, finally, examine which features of the "body in the brain" should be transferred to robots, giving rise to more adaptive, resilient, self-calibrating machines.
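To make the contrast concrete, here is a minimal sketch (not from the article) of what an "explicit, veridical body model" typically looks like in classical robot control: the forward kinematics of a two-link planar arm, where the geometry is written down once as fixed parameters and used to predict the hand position. The link lengths, joint angles, and the 1 cm mismatch below are purely illustrative assumptions.

```python
import math

def forward_kinematics(theta1, theta2, l1=0.30, l2=0.25):
    """Predict the end-effector (x, y) of a 2-link planar arm.

    theta1, theta2: joint angles in radians.
    l1, l2: link lengths in metres - the explicit body model.
    """
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

# The controller trusts this model completely: if the real second link is
# 1 cm longer (growth, wear, a tool held in the hand), every prediction is
# biased until someone re-measures the robot. (Hypothetical numbers.)
predicted = forward_kinematics(math.radians(30), math.radians(45))
actual = forward_kinematics(math.radians(30), math.radians(45), l2=0.26)
print("predicted:", predicted)
print("actual   :", actual)
print("error (m):", math.dist(predicted, actual))
```

The sketch only illustrates the abstract's point: such a model is fixed, amodal, and explicit, whereas a plastic, multimodal body schema would absorb the mismatch by recalibrating itself from sensory feedback.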

research
11/06/2022

Learning body models: from humans to humanoids

Humans and animals excel in combining information from multiple sensory ...
research
11/25/2020

Sensorimotor representation learning for an "active self" in robots: A model survey

Safe human-robot interactions require robots to be able to learn how to ...
research
01/20/2022

Body Models in Humans and Robots

Neurocognitive models of higher-level somatosensory processing have emph...
research
02/02/2012

The implications of embodiment for behavior and cognition: animal and robotic case studies

In this paper, we will argue that if we want to understand the function ...
research
03/31/2021

Enhancing human bodies with extra robotic arms and fingers: The Neural Resource Allocation Problem

The emergence of robot-based body augmentation promises exciting innovat...
research
01/15/2018

Robots as Powerful Allies for the Study of Embodied Cognition from the Bottom Up

A large body of compelling evidence has been accumulated demonstrating t...
research
09/05/2019

The homunculus for proprioception: Toward learning the representation of a humanoid robot's joint space using self-organizing maps

In primate brains, tactile and proprioceptive inputs are relayed to the ...
