Determination of the edge of criticality in echo state networks through Fisher information maximization

03/11/2016
by Lorenzo Livi, et al.

It is a widely accepted fact that the computational capability of recurrent neural networks is maximized on the so-called "edge of criticality". When a network operates in this regime, it performs efficiently on a specific application in terms of both (i) low prediction error and (ii) high short-term memory capacity. Since the behavior of recurrent networks is strongly influenced by the particular input signal driving the dynamics, a universal, application-independent method for determining the edge of criticality is still missing. In this paper, we address this issue by proposing a theoretically motivated, unsupervised method based on Fisher information for determining the edge of criticality in recurrent neural networks. It has been proven that Fisher information is maximized for (finite-size) systems operating in such critical regions. However, Fisher information is notoriously difficult to compute: it requires knowledge of either the probability density function of the system states or their conditional dependence on the model parameters. The paper takes advantage of a recently developed non-parametric estimator of the Fisher information matrix and provides a method to determine the critical region of echo state networks, a particular class of recurrent networks. The considered control parameters, which indirectly affect echo state network performance, are explored to identify those configurations lying on the edge of criticality and, as such, maximizing Fisher information and computational performance. Experimental results on benchmarks and real-world data demonstrate the effectiveness of the proposed method.
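To make the control parameters concrete, the sketch below builds a minimal echo state network reservoir and drives it with an input signal. The spectral radius of the recurrent weight matrix and the input scaling are the kind of parameters the paper sweeps when searching for the edge of criticality; the non-parametric Fisher information estimator itself is omitted, and the function names and parameter values here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def make_esn(n_reservoir=100, spectral_radius=0.9, input_scaling=0.5, seed=0):
    """Build a random ESN reservoir. spectral_radius and input_scaling
    stand in for the control parameters explored in the paper."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((n_reservoir, n_reservoir))
    # Rescale so the largest eigenvalue modulus equals the target spectral radius.
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    W_in = input_scaling * rng.standard_normal((n_reservoir, 1))
    return W, W_in

def run_reservoir(W, W_in, u):
    """Drive the reservoir with a scalar input sequence u, collecting states."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in[:, 0] * u_t)
        states.append(x.copy())
    return np.array(states)

# Example driving signal: a slow sinusoid.
u = np.sin(np.linspace(0, 8 * np.pi, 500))
W, W_in = make_esn()
states = run_reservoir(W, W_in, u)  # shape (500, 100)
```

In an edge-of-criticality search, one would generate such state trajectories for a grid of (spectral radius, input scaling) values and score each configuration, in the paper's case via the estimated Fisher information of the state distribution.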

Related research

- Optimal short-term memory before the edge of chaos in driven random recurrent networks (12/24/2019)
  The ability of discrete-time nonlinear recurrent neural networks to stor...
- Eigenvalue Normalized Recurrent Neural Networks for Short Term Memory (11/18/2019)
  Several variants of recurrent neural networks (RNNs) with orthogonal or ...
- Memory and Information Processing in Recurrent Neural Networks (04/23/2016)
  Recurrent neural networks (RNN) are simple dynamical systems whose compu...
- GeoSeq2Seq: Information Geometric Sequence-to-Sequence Networks (10/25/2017)
  The Fisher information metric is an important foundation of information ...
- Multiplex visibility graphs to investigate recurrent neural networks dynamics (09/10/2016)
  A recurrent neural network (RNN) is a universal approximator of dynamica...
- Echo State Networks with Self-Normalizing Activations on the Hyper-Sphere (03/27/2019)
  Among the various architectures of Recurrent Neural Networks, Echo State...
- Neuroevolution on the Edge of Chaos (06/05/2017)
  Echo state networks represent a special type of recurrent neural network...
