Where is my forearm? Clustering of body parts from simultaneous tactile and linguistic input using sequential mapping

06/08/2017
by Karla Stepanova, et al.

Humans and animals are constantly exposed to a continuous stream of sensory information from different modalities. At the same time, they form more compressed representations like concepts or symbols. In species that use language, this process is further shaped by the interaction with language, where a mapping between sensorimotor concepts and linguistic elements needs to be established. There is evidence that children might be learning language by simply disambiguating potential meanings based on multiple exposures to utterances in different contexts (cross-situational learning). In existing models, the mapping between modalities is usually found in a single step by directly using the frequencies of referent and meaning co-occurrences. In this paper, we present an extension of this one-step mapping and introduce a newly proposed sequential mapping algorithm, together with a publicly available Matlab implementation. For demonstration, we have chosen a less typical scenario: instead of learning to associate objects with their names, we focus on body representations. A humanoid robot receives tactile stimulation on its body while at the same time listening to utterances of the body part names (e.g., hand, forearm, and torso). With the goal of arriving at the correct "body categories", we demonstrate how the sequential mapping algorithm outperforms one-step mapping. In addition, the effects of data set size and of noise in the linguistic input are studied.
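To make the contrast concrete, here is a minimal sketch of cross-situational mapping from co-occurrence counts. The one-step variant assigns each word its most frequent co-occurring referent all at once; the sequential variant greedily fixes the globally strongest (word, referent) pair first and excludes it before choosing the next. This is an illustrative toy (hypothetical function names and data), not the paper's Matlab implementation.

```python
import numpy as np

def cooccurrence(pairs, n_words, n_referents):
    """Count how often word w co-occurred with referent r across exposures."""
    C = np.zeros((n_words, n_referents))
    for w, r in pairs:
        C[w, r] += 1
    return C

def one_step_mapping(C):
    """Map every word to its most frequent referent in a single step."""
    return {w: int(np.argmax(C[w])) for w in range(C.shape[0])}

def sequential_mapping(C):
    """Greedy sequential variant: repeatedly take the globally strongest
    (word, referent) pair, then exclude that word and referent."""
    C = C.astype(float).copy()
    mapping = {}
    while np.max(C) > 0:
        w, r = np.unravel_index(np.argmax(C), C.shape)
        mapping[int(w)] = int(r)
        C[w, :] = -1  # word w is resolved
        C[:, r] = -1  # referent r is taken
    return mapping

# Ambiguous data: both words co-occur most often with referent 0.
exposures = [(0, 0), (0, 0), (0, 0), (0, 1), (0, 1),
             (1, 0), (1, 0), (1, 1)]
C = cooccurrence(exposures, n_words=2, n_referents=2)
print(one_step_mapping(C))    # both words collapse onto referent 0
print(sequential_mapping(C))  # exclusion disambiguates word 1
```

In this toy case the one-step rule maps both words to the same referent, while the sequential rule resolves the conflict by removing the already-claimed referent from later choices, which is the intuition the paper exploits when clustering body parts.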


