Bidirectional Interaction between Visual and Motor Generative Models using Predictive Coding and Active Inference

04/19/2021
by Louis Annabi, et al.

In this work, we build upon the Active Inference (AIF) and Predictive Coding (PC) frameworks to propose a neural architecture comprising a generative model for sensory prediction and a distinct generative model for motor trajectories. We highlight how sequences of sensory predictions can act as rails guiding the learning, control, and online adaptation of motor trajectories. We furthermore investigate the effects of bidirectional interactions between the motor and visual modules. The architecture is tested on the control of a simulated robotic arm learning to reproduce handwritten letters.
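To make the predictive-coding mechanism underlying such architectures concrete, here is a minimal sketch of error-driven inference in a linear generative model. This is an illustrative toy, not the paper's actual architecture: the mapping `W`, the latent state `mu`, and the step size are all hypothetical. A latent belief generates a top-down sensory prediction, and the bottom-up prediction error drives the belief update, as in standard predictive coding.

```python
import numpy as np

# Illustrative predictive-coding loop (hypothetical linear model, not
# the architecture from the paper). A latent state `mu` generates a
# sensory prediction through a fixed mapping W; inference minimizes
# the squared prediction error by gradient descent on `mu`.

W = np.array([[1.0, 0.0],     # generative mapping: latent -> sensory
              [0.0, 1.0],
              [0.5, -0.5],
              [0.2, 0.3]])

true_latent = np.array([1.0, -0.5])  # hypothetical cause of the observation
obs = W @ true_latent                # noiseless sensory observation

mu = np.zeros(2)                     # initial belief about the latent state
lr = 0.5                             # inference step size

for _ in range(100):
    pred = W @ mu                    # top-down sensory prediction
    err = obs - pred                 # bottom-up prediction error
    mu += lr * W.T @ err             # error-driven belief update

final_error = np.linalg.norm(obs - W @ mu)
print(final_error)                   # prediction error shrinks toward 0
```

In the paper's setting, the same principle operates over sequences: predicted sensory trajectories supply the error signal that shapes motor trajectories, and motor outcomes in turn feed back into the visual module.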
