Distributed Sequence Memory of Multidimensional Inputs in Recurrent Networks

05/26/2016
by Adam Charles, et al.

Recurrent neural networks (RNNs) have drawn interest from machine learning researchers because of their effectiveness at preserving past inputs for time-varying data processing tasks. To understand the success and limitations of RNNs, it is critical that we advance our analysis of their fundamental memory properties. We focus on echo state networks (ESNs), which are RNNs with simple memoryless nodes and random connectivity. Most existing analyses of short-term memory (STM) capacity conclude that, for unstructured inputs, the ESN network size must scale linearly with the input size. The main contribution of this paper is to provide general results characterizing the STM capacity for linear ESNs with multidimensional input streams when the inputs have common low-dimensional structure: sparsity in a basis or significant statistical dependence between inputs. In both cases, we show that the number of nodes in the network must scale linearly with the information rate and only poly-logarithmically with the ambient input dimension. The analysis relies on advanced applications of random matrix theory and yields explicit non-asymptotic bounds on the recovery error. Taken together, this analysis provides a significant step forward in our understanding of the STM properties of RNNs.
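The recovery setting behind these capacity results can be made concrete with a small numerical sketch. In a linear ESN the final state is a linear measurement of the stacked input history, so STM recovery of a sparse multidimensional input stream becomes a compressed-sensing-style inverse problem. The sketch below is illustrative only: the dimensions, the spectral-radius rescaling used to enforce stability, and the use of ISTA as the l1 solver are assumptions for demonstration, not the paper's specific construction or proof technique.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the paper):
N, M, T, K = 100, 20, 10, 8   # nodes, input dim, time steps, total nonzeros

# Random recurrent weights, rescaled so the spectral radius is below 1
# (a standard way to enforce the echo state property), plus random input weights.
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
V = rng.standard_normal((N, M)) / np.sqrt(N)

# A sparse multidimensional input stream: K nonzeros among the M*T entries.
s = np.zeros(M * T)
s[rng.choice(M * T, K, replace=False)] = rng.standard_normal(K)
S = s.reshape(T, M)

# Run the linear ESN: x[t] = W x[t-1] + V s[t].
x = np.zeros(N)
for t in range(T):
    x = W @ x + V @ S[t]

# The final state is x = [W^{T-1} V, ..., W V, V] @ s, i.e. a linear
# measurement of the whole input history. Build that matrix explicitly.
A = np.zeros((N, M * T))
Wk = np.eye(N)
for t in range(T - 1, -1, -1):        # the most recent input sees W^0
    A[:, t * M:(t + 1) * M] = Wk @ V
    Wk = W @ Wk

# Recover s from x by l1 minimization via ISTA (iterative soft thresholding).
step = 1.0 / np.linalg.norm(A, 2) ** 2   # inverse Lipschitz constant of the gradient
lam = 1e-3
s_hat = np.zeros(M * T)
for _ in range(5000):
    z = s_hat - step * (A.T @ (A @ s_hat - x))
    s_hat = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("relative recovery error:", np.linalg.norm(s_hat - s) / np.linalg.norm(s))
```

With these (assumed) dimensions, the number of nodes N = 100 is far smaller than the M*T = 200 entries of the input history, yet the printed relative error should be small because only K = 8 entries are nonzero, which is the compressed-sensing intuition behind the paper's node-count scaling with information rate rather than ambient dimension.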


