Towards Modeling the Interaction of Spatial-Associative Neural Network Representations for Multisensory Perception

by German I. Parisi, et al.

Our daily perceptual experience is driven by different neural mechanisms that yield multisensory interaction as the interplay between exogenous stimuli and endogenous expectations. While the interaction of multisensory cues according to their spatiotemporal properties and the formation of multisensory feature-based representations have been widely studied, the interaction of spatial-associative neural representations has received considerably less attention. In this paper, we propose a neural network architecture that models the interaction of spatial-associative representations to perform causal inference of audiovisual stimuli. We investigate the spatial alignment of exogenous audiovisual stimuli modulated by associative congruence. In the spatial layer, topographically arranged networks account for the interaction of audiovisual input in terms of population codes. In the associative layer, congruent audiovisual representations are obtained via the experience-driven development of feature-based associations. Levels of congruency are obtained as a by-product of the neurodynamics of self-organizing networks, where the amount of neural activation triggered by the input can be expressed via a nonlinear distance function. Our novel proposal is that activity-driven levels of congruency can be used as top-down modulatory projections to spatially distributed representations of sensory input, e.g. semantically related audiovisual pairs will yield a higher level of integration than unrelated pairs. Furthermore, levels of neural response in unimodal layers may be seen as sensory reliability for the dynamic weighting of crossmodal cues. We describe a series of planned experiments to validate our model in the tasks of multisensory interaction on the basis of semantic congruence and unimodal cue reliability.
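The two mechanisms described in the abstract — congruency emerging as a nonlinear function of input-to-weight distance in a self-organizing network, and congruency then gating a reliability-weighted fusion of unimodal spatial estimates — can be illustrated with a minimal sketch. This is not the paper's architecture: the 1-D SOM lattice, the Gaussian congruency function `exp(-d²/τ)`, and all sizes and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, n_units=16, epochs=50, lr=0.5, sigma=2.0):
    """Train a tiny 1-D self-organizing map on concatenated
    audiovisual feature pairs (associative layer stand-in)."""
    weights = rng.uniform(0, 1, size=(n_units, data.shape[1]))
    grid = np.arange(n_units)
    for t in range(epochs):
        decay = np.exp(-t / epochs)  # shrink learning rate and neighborhood
        for x in rng.permutation(data):
            bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
            h = np.exp(-((grid - bmu) ** 2) / (2 * (sigma * decay) ** 2))
            weights += (lr * decay) * h[:, None] * (x - weights)
    return weights

def congruency(weights, x, tau=0.5):
    """Congruency as a by-product of the neurodynamics: a nonlinear
    (here Gaussian, an assumption) function of the distance between
    the input and its best-matching unit."""
    d = np.min(np.linalg.norm(weights - x, axis=1))
    return float(np.exp(-(d ** 2) / tau))

def integrate(loc_v, loc_a, rel_v, rel_a, c):
    """Spatial layer: reliability-weighted fusion of unimodal location
    estimates, with congruency c gating how much the auditory estimate
    is integrated (low c falls back on the visual estimate)."""
    fused = (rel_v * loc_v + rel_a * loc_a) / (rel_v + rel_a)
    return c * fused + (1.0 - c) * loc_v

# Toy data: congruent audiovisual pairs share the same underlying feature.
feat = rng.uniform(0, 1, size=(200, 4))
som = train_som(np.hstack([feat, feat]))

congruent = np.hstack([feat[0], feat[0]])
incongruent = np.hstack([feat[0], 1.0 - feat[0]])
c_hi = congruency(som, congruent)
c_lo = congruency(som, incongruent)
print(c_hi, c_lo)  # a related pair should yield higher congruency

# Higher congruency pulls the estimate toward the fused location.
print(integrate(0.0, 1.0, rel_v=2.0, rel_a=1.0, c=c_hi))
print(integrate(0.0, 1.0, rel_v=2.0, rel_a=1.0, c=c_lo))
```

Under these assumptions, a semantically related pair lies near the SOM's learned manifold and yields a higher congruency value, so its spatial estimates are integrated more strongly than those of an unrelated pair — the modulation effect the abstract proposes.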




