Slow manifolds in recurrent networks encode working memory efficiently and robustly

01/08/2021
by Elham Ghazizadeh, et al.

Working memory is a cognitive function involving the storage and manipulation of latent information over brief intervals of time, making it crucial for context-dependent computation. Here, we use a top-down modeling approach to examine network-level mechanisms of working memory, an enigmatic issue and central topic of study in neuroscience and machine intelligence. We train thousands of recurrent neural networks on a working memory task and then perform dynamical systems analysis on the resulting optimized networks, finding that four distinct dynamical mechanisms can emerge. In particular, we show the prevalence of a mechanism in which memories are encoded along slow stable manifolds in the network state space, leading to a phasic neuronal activation profile during memory periods. In contrast to mechanisms in which memories are directly encoded at stable attractors, these networks naturally forget stimuli over time. Despite this apparent functional disadvantage, they are more efficient in how they leverage their attractor landscape and, paradoxically, are considerably more robust to noise. Our results provide new dynamical hypotheses regarding how working memory function is encoded in both natural and artificial neural networks.
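
The paper's own code is not reproduced here, but the pipeline the abstract describes (train RNNs on a memory task, then analyze the resulting dynamics) can be sketched minimally. The snippet below trains one small PyTorch RNN on an assumed two-cue delayed-recall task, then searches for slow points by gradient descent on the speed q(h) = ½‖F(h) − h‖² of the zero-input update, in the spirit of standard fixed-point analyses of trained RNNs (e.g., Sussillo & Barak, 2013). The model, task parameters, and names (`MemoryRNN`, `make_batch`, cue/delay lengths) are illustrative assumptions, not the authors' setup.

```python
# Illustrative sketch, not the paper's code: train a small RNN on an assumed
# delayed-recall task, then look for slow points of its autonomous dynamics.
import torch
import torch.nn as nn

torch.manual_seed(0)

N_IN, N_HID, T_STIM, T_DELAY = 2, 64, 5, 20   # assumed task/network sizes

class MemoryRNN(nn.Module):
    """Vanilla tanh RNN with a linear readout (illustrative architecture)."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(N_IN, N_HID, batch_first=True, nonlinearity="tanh")
        self.readout = nn.Linear(N_HID, 1)

    def forward(self, x):
        h, _ = self.rnn(x)                     # h: (batch, T, N_HID)
        return self.readout(h), h

def make_batch(batch=128):
    """Two-cue delayed-recall task: show cue 0 or 1 briefly, report it after a delay."""
    labels = torch.randint(0, 2, (batch,))
    x = torch.zeros(batch, T_STIM + T_DELAY, N_IN)
    for b in range(batch):
        x[b, :T_STIM, labels[b]] = 1.0
    y = labels.float().unsqueeze(1) * 2 - 1    # targets in {-1, +1}
    return x, y

model = MemoryRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(500):                        # brief training run for illustration
    x, y = make_batch()
    out, _ = model(x)
    loss = ((out[:, -1] - y) ** 2).mean()      # penalize readout at the final step
    opt.zero_grad(); loss.backward(); opt.step()

# Slow-point search: minimize the speed q(h) = 0.5 * ||F(h) - h||^2 of the
# autonomous (zero-input) update F. q(h) == 0 marks a true fixed point; small
# but nonzero q flags candidate slow manifolds along which memories decay.
for p in model.parameters():
    p.requires_grad_(False)                    # freeze weights; optimize states only

def rnn_step(h):
    zero_in = torch.zeros(h.shape[0], 1, N_IN)
    _, h_next = model.rnn(zero_in, h.unsqueeze(0))
    return h_next.squeeze(0)

x, _ = make_batch(16)
_, traj = model(x)
h = traj[:, -1].detach().clone().requires_grad_(True)  # seed from visited states
sp_opt = torch.optim.Adam([h], lr=1e-2)
for _ in range(2000):
    q = 0.5 * ((rnn_step(h) - h) ** 2).sum(dim=1)
    sp_opt.zero_grad(); q.sum().backward(); sp_opt.step()
print("final speeds q(h):", q.detach().numpy().round(6))
```

States where q converges to a tiny but nonzero value are candidates for the slow manifolds the abstract describes; linearizing the dynamics there (eigenvalues of the Jacobian of F) is the usual way to distinguish stable attractors from slowly decaying memory modes.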

research · 12/09/2022
Emergent Computations in Trained Artificial Neural Networks and Real Brains
Synaptic plasticity allows cortical circuits to learn new tasks and to a...

research · 08/24/2023
Persistent learning signals and working memory without continuous attractors
Neural dynamical systems with stable attractor structures, such as point...

research · 03/24/2020
Input representation in recurrent neural networks dynamics
Reservoir computing is a popular approach to design recurrent neural net...

research · 12/23/2016
A State Space Approach for Piecewise-Linear Recurrent Neural Networks for Reconstructing Nonlinear Dynamics from Neural Measurements
The computational properties of neural systems are often thought to be i...

research · 03/29/2022
A Computational Architecture for Machine Consciousness and Artificial Superintelligence: Updating Working Memory Iteratively
This theoretical article examines how to construct human-like working me...

research · 03/02/2021
On the Memory Mechanism of Tensor-Power Recurrent Models
Tensor-power (TP) recurrent model is a family of non-linear dynamical sy...

research · 01/03/2022
Biased Hypothesis Formation From Projection Pursuit
The effect of bias on hypothesis formation is characterized for an autom...
