Memory and Information Processing in Recurrent Neural Networks

04/23/2016
by Alireza Goudarzi, et al.

Recurrent neural networks (RNNs) are simple dynamical systems whose computational power has been attributed to their short-term memory. Previously, the short-term memory of RNNs has been studied analytically only for orthogonal networks, and only under the annealed approximation with uncorrelated input. Here, for the first time, we present an exact solution for the memory capacity and the task-solving performance as a function of the structure of a given network instance, enabling direct determination of the function-structure relation in RNNs. We calculate the memory capacity for arbitrary networks with exponentially correlated input and further relate it to the performance of the system on signal-processing tasks in a supervised-learning setup. We compute the expected error and the worst-case error bound as a function of the spectra of the network and the correlation structure of its inputs and outputs. Our results explain learning and generalization in task solving using short-term memory, which is crucial for building alternative computer architectures that exploit physical phenomena based on the short-term memory principle.
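For context, the short-term memory capacity referred to in the abstract is conventionally defined (following Jaeger's standard formulation) as MC = Σ_k m(k), where m(k) is the squared correlation between the delay-k input u[t-k] and the best linear readout of the network state. The sketch below estimates this quantity empirically for a random linear network driven by uncorrelated input, the baseline case the paper generalizes; all parameter values (reservoir size, delays, washout) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters (assumptions, not from the paper).
N = 50            # network (reservoir) size
T = 5000          # driving time steps
washout = 200     # initial transient to discard; must exceed max_delay
max_delay = 100   # longest recall delay tested

# Random recurrent weights, rescaled to spectral radius 0.9 (stable regime).
W = rng.normal(size=(N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
w_in = rng.uniform(-1, 1, size=N)

# Drive a linear RNN with i.i.d. uniform input (the uncorrelated case).
u = rng.uniform(-1, 1, size=T)
X = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = W @ x + w_in * u[t]
    X[t] = x

# Memory function m(k): squared correlation between the delayed target
# u[t-k] and its optimal linear reconstruction from the state x[t].
mc = 0.0
for k in range(1, max_delay + 1):
    Xk, yk = X[washout:], np.roll(u, k)[washout:]   # washout removes wraparound
    w_out, *_ = np.linalg.lstsq(Xk, yk, rcond=None)  # least-squares readout
    m_k = np.corrcoef(Xk @ w_out, yk)[0, 1] ** 2
    mc += m_k

print(f"empirical memory capacity: {mc:.1f} (theoretical ceiling N = {N})")
```

For a linear network of N units the capacity is bounded by N, so the empirical estimate should land somewhat below 50 here; the paper's contribution is an exact, instance-specific expression for this quantity, whereas this sketch only measures it numerically.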


Related research

- Fast Weight Long Short-Term Memory (04/18/2018)
  Associative memory using fast weights is a short-term memory mechanism t...

- Optimal short-term memory before the edge of chaos in driven random recurrent networks (12/24/2019)
  The ability of discrete-time nonlinear recurrent neural networks to stor...

- Use of recurrent infomax to improve the memory capability of input-driven recurrent neural networks (02/14/2018)
  The inherent transient dynamics of recurrent neural networks (RNNs) have...

- Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks (05/26/2016)
  Recurrent neural networks (RNNs) have drawn interest from machine learni...

- Learning to Control Rapidly Changing Synaptic Connections: An Alternative Type of Memory in Sequence Processing Artificial Neural Networks (11/17/2022)
  Short-term memory in standard, general-purpose, sequence-processing recu...

- Short Term Memory Capacity in Networks via the Restricted Isometry Property (07/01/2013)
  Cortical networks are hypothesized to rely on transient network activity...

- Determination of the edge of criticality in echo state networks through Fisher information maximization (03/11/2016)
  It is a widely accepted fact that the computational capability of recurr...
