The homunculus for proprioception: Toward learning the representation of a humanoid robot's joint space using self-organizing maps

09/05/2019
by Filipe Gama, et al.

In primate brains, tactile and proprioceptive inputs are relayed to the somatosensory cortex, which is known for its somatotopic representations, or "homunculi". Our research centers on understanding the mechanisms by which these and other, higher-level body representations (the body schema) are formed, using humanoid robots and neural networks to construct models. We specifically focus on how a spatial representation of the body may be learned from somatosensory information in self-touch configurations. In this work, we target the representation of proprioceptive inputs, which we take to be the robot's joint angles. Inputs collected in different body postures serve as inputs to a Self-Organizing Map (SOM) with a 2D output lattice. With unrestricted, all-to-all connections, the map cannot represent the input space while preserving its topological relationships, because the intrinsic dimensionality of the body posture space is too high. Hence, we use a method we previously developed for tactile inputs (Hoffmann, Straka et al. 2018), called MRF-SOM, in which the Maximum Receptive Field of the output neurons is restricted so that they learn to represent only specific parts of the input space. This is in line with the receptive fields of neurons in somatosensory areas representing proprioception, which often respond to combinations of only a few joints (e.g., wrist and elbow).
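The idea of a SOM over joint angles with restricted receptive fields can be illustrated with a minimal sketch. This is not the authors' MRF-SOM implementation: the masking scheme below (each output neuron attends to a fixed random pair of joints) is an illustrative assumption, and all names, sizes, and hyperparameters here are hypothetical; the actual method of Hoffmann, Straka et al. (2018) may differ in how receptive fields are chosen and restricted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: 4 joint angles (e.g. two shoulder/elbow pairs), 10x10 output lattice.
n_joints, grid = 4, (10, 10)
n_units = grid[0] * grid[1]

# One prototype posture (weight vector) per output neuron.
W = rng.uniform(-1.0, 1.0, size=(n_units, n_joints))

# Hypothetical receptive-field mask: each neuron "sees" only 2 of the 4 joints,
# loosely mimicking proprioceptive neurons responding to a few joints.
mask = np.zeros((n_units, n_joints))
for i in range(n_units):
    mask[i, rng.choice(n_joints, size=2, replace=False)] = 1.0

# 2D lattice coordinates, used by the neighborhood function.
coords = np.array([(i, j) for i in range(grid[0]) for j in range(grid[1])], float)

def train_step(x, lr=0.1, sigma=2.0):
    # Best-matching unit under the masked (restricted) distance.
    d = np.sum(mask * (W - x) ** 2, axis=1)
    bmu = int(np.argmin(d))
    # Gaussian neighborhood on the output lattice.
    h = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1) / (2 * sigma ** 2))
    # Standard SOM update, restricted to each neuron's masked joints.
    W[:] += lr * h[:, None] * mask * (x - W)
    return bmu

# Train on random "postures" (joint-angle vectors, normalized to [-1, 1]).
for t in range(2000):
    train_step(rng.uniform(-1.0, 1.0, size=n_joints))
```

Because each neuron only updates (and measures distance on) its masked joints, the map can tile a high-dimensional posture space with locally low-dimensional patches, which is the intuition behind restricting receptive fields.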
