Continuous parameter working memory in a balanced chaotic neural network

08/27/2015
by Nimrod Shaham, et al.

It has been proposed that neural noise in the cortex arises from chaotic dynamics in the balanced state: in this model of cortical dynamics, the excitatory and inhibitory inputs to each neuron approximately cancel, and activity is driven by fluctuations of the synaptic inputs around their mean. It remains unclear whether neural networks in the balanced state can perform tasks that are highly sensitive to noise, such as storage of continuous parameters in working memory, while also accounting for the irregular behavior of single neurons. Here we show that continuous parameter working memory can be maintained in the balanced state, in a neural circuit with a simple network architecture. We show analytically that in the limit of an infinite network, the dynamics generated by this architecture are characterized by a continuous set of steady balanced states, allowing for the indefinite storage of a continuous parameter. In finite networks, we show that the chaotic noise drives diffusive motion along the approximate attractor, which gradually degrades the stored memory. We analyze the dynamics and show that the slow diffusive motion induces slowly decaying temporal cross correlations in the activity, which differ substantially from those previously described in the balanced state. We calculate the diffusivity and show that it is inversely proportional to the system size. For large enough (but realistic) neural population sizes, and with suitable tuning of the network connections, the proposed balanced network can sustain continuous parameter values in memory over time scales several orders of magnitude longer than the single-neuron time scale.
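
The balanced-state mechanism described in the abstract can be illustrated with a small simulation. The sketch below is a generic sparse excitatory-inhibitory rate network in the balanced regime, not the specific architecture proposed in the paper: synapses scale as 1/sqrt(K), the external drive scales as sqrt(K), and the leading-order excitatory and inhibitory inputs nearly cancel, leaving O(1) fluctuations that drive irregular activity. All parameter values (N_E, N_I, K, the couplings J_ab, the drives E_E, E_I, m0) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical parameters for a generic sparse balanced E-I rate network ---
# (not the paper's architecture; values chosen only so the demo runs quickly)
N_E = N_I = 500                      # excitatory / inhibitory population sizes
K = 100                              # mean number of inputs from each population
p_E, p_I = K / N_E, K / N_I          # connection probabilities

# O(1) coupling constants; individual synapses are scaled by 1/sqrt(K),
# which is the hallmark of the balanced state.
J_EE, J_EI, J_IE, J_II = 1.0, -2.0, 1.0, -1.8
E_E, E_I, m0 = 1.0, 0.8, 0.3         # O(1) external-drive constants

def block(n_post, n_pre, p, j):
    """Sparse random block: each synapse present with prob. p, strength j/sqrt(K)."""
    return (rng.random((n_post, n_pre)) < p) * (j / np.sqrt(K))

W = np.block([
    [block(N_E, N_E, p_E, J_EE), block(N_E, N_I, p_I, J_EI)],
    [block(N_I, N_E, p_E, J_IE), block(N_I, N_I, p_I, J_II)],
])
W_exc, W_inh = np.clip(W, 0, None), np.clip(W, None, 0)   # split for bookkeeping

# External drive is O(sqrt(K)), so cancellation happens only at leading order.
ext = np.sqrt(K) * m0 * np.concatenate([np.full(N_E, E_E), np.full(N_I, E_I)])

phi = lambda h: np.clip(np.tanh(h), 0.0, 1.0)   # simple saturating rate function

tau, dt, steps = 10.0, 0.5, 8000                # ms; ~4 s of simulated time
h = rng.normal(0.0, 1.0, N_E + N_I)

for _ in range(steps):
    r = phi(h)
    h += (dt / tau) * (-h + W @ r + ext)

# Balance check: excitatory and inhibitory inputs are each O(sqrt(K)),
# but their sum plus the external drive is O(1) per neuron.
r = phi(h)
e_in, i_in = W_exc @ r, W_inh @ r
print("mean excitatory input:", e_in.mean())
print("mean inhibitory input:", i_in.mean())
print("mean net input       :", (e_in + i_in + ext).mean())
print("mean rates (E, I)    :", r[:N_E].mean(), r[N_E:].mean())
```

In the paper's setting, fluctuations of this kind act on a network that additionally possesses a near-continuum of balanced fixed points, so they drive diffusion of the stored value along the approximate attractor; since the reported diffusivity scales inversely with the system size, the time over which the memory stays within a fixed tolerance grows roughly linearly with the number of neurons.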

Related research

Balanced Excitation and Inhibition are Required for High-Capacity, Noise-Robust Neuronal Selectivity (05/03/2017)
Neurons and networks in the cerebral cortex must operate reliably despit...

Input correlations impede suppression of chaos and learning in balanced rate networks (01/24/2022)
Neural circuits exhibit complex activity patterns, both spontaneously an...

A new GPU library for fast simulation of large-scale networks of spiking neurons (07/28/2020)
Over the past decade there has been a growing interest in the developmen...

Persistent learning signals and working memory without continuous attractors (08/24/2023)
Neural dynamical systems with stable attractor structures, such as point...

Artificial Intelligence Software Structured to Simulate Human Working Memory, Mental Imagery, and Mental Continuity (03/29/2022)
This article presents an artificial intelligence (AI) architecture inten...

Kernel Memory Networks: A Unifying Framework for Memory Modeling (08/19/2022)
We consider the problem of training a neural network to store a set of p...
