Predictive coding in balanced neural networks with noise, chaos and delays

06/25/2020
by Jonathan Kadmon, et al.

Biological neural networks face a formidable task: performing reliable computations in the face of intrinsic stochasticity in individual neurons, imprecisely specified synaptic connectivity, and nonnegligible delays in synaptic transmission. A common approach to combating such biological heterogeneity involves averaging over large redundant networks of N neurons, resulting in coding errors that decrease classically as 1/√(N). Recent work demonstrated a novel mechanism whereby recurrent spiking networks could efficiently encode dynamic stimuli, achieving a superclassical scaling in which coding errors decrease as 1/N. This specific mechanism involved two key ideas: predictive coding, and a tight balance, or cancellation, between strong feedforward inputs and strong recurrent feedback. However, the theoretical principles governing the efficacy of balanced predictive coding and its robustness to noise, synaptic weight heterogeneity, and communication delays remain poorly understood. To discover such principles, we introduce an analytically tractable model of balanced predictive coding, in which the degree of balance and the degree of weight disorder can be dissociated, unlike in previous balanced network models, and we develop a mean-field theory of coding accuracy. Overall, our work provides and solves a general theoretical framework for dissecting the differential contributions of neural noise, synaptic disorder, chaos, synaptic delays, and balance to the fidelity of predictive neural codes; reveals the fundamental role that balance plays in achieving superclassical scaling; and unifies previously disparate models in theoretical neuroscience.
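
The abstract's central contrast, classical 1/√(N) averaging versus the superclassical 1/N scaling of tightly balanced predictive coding, can be illustrated with a minimal sketch. The toy script below is not the paper's model (it omits intrinsic noise, weight disorder, chaos, and synaptic delays, and all parameters are illustrative assumptions); it simply compares a rate-averaging readout of N independent Poisson neurons with a stripped-down spike-based predictive coder in the spirit of the balanced spiking-network coders the abstract refers to, where a neuron fires only when its spike reduces the readout error.

```python
import numpy as np

rng = np.random.default_rng(0)


def classical_error(N, x=1.0, T=200.0, dt=0.01, lam=1.0):
    """Rate code: average the filtered spikes of N independent Poisson
    neurons, each firing at rate x.  Readout error shrinks as 1/sqrt(N)."""
    r = np.zeros(N)                         # exponentially filtered spike trains
    errs = []
    for _ in range(int(T / dt)):
        spikes = rng.random(N) < x * dt     # independent Poisson spiking
        r += -lam * r * dt + spikes
        x_hat = lam * r.mean()              # unbiased estimate of the rate x
        errs.append((x_hat - x) ** 2)
    return np.sqrt(np.mean(errs[len(errs) // 2:]))   # RMS error after transient


def predictive_error(N, x=1.0, T=200.0, dt=0.01, lam=1.0):
    """Spike-based predictive coder (toy version): a neuron spikes only when
    its voltage, proportional to the readout error x - x_hat, crosses a
    threshold of half a decoder weight, so each spike corrects the readout.
    With decoder weights w = 1/N the error stays within ~w/2, i.e. ~1/N."""
    w = 1.0 / N                             # readout jump caused by one spike
    x_hat = 0.0                             # leaky linear readout of the spikes
    errs = []
    for _ in range(int(T / dt)):
        x_hat += -lam * x_hat * dt          # leak of the filtered spike trains
        while x - x_hat > w / 2.0:          # greedy spike rule
            x_hat += w                      # one spike nudges the readout by w
        errs.append((x_hat - x) ** 2)
    return np.sqrt(np.mean(errs[len(errs) // 2:]))


for N in (10, 100, 1000):
    print(f"N={N:4d}   classical ~1/sqrt(N): {classical_error(N):.4f}   "
          f"predictive ~1/N: {predictive_error(N):.5f}")
```

Under these assumptions the rate-averaging error shrinks roughly as 1/√(N) while the predictive coder's error tracks w/2 = 1/(2N); the paper's mean-field theory analyzes how noise, synaptic disorder, chaos, and delays degrade precisely this superclassical scaling.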
