
Quantifying and maximizing the information flux in recurrent neural networks

01/30/2023
by Claus Metzner, et al.

Free-running Recurrent Neural Networks (RNNs), especially probabilistic models, generate an ongoing information flux that can be quantified with the mutual information I[x⃗(t),x⃗(t+1)] between subsequent system states x⃗. Although previous studies have shown that I depends on the statistics of the network's connection weights, it is unclear (1) how to maximize I systematically and (2) how to quantify the flux in large systems, where computing the mutual information becomes intractable. Here, we address these questions using Boltzmann machines as model systems. We find that in networks with moderately strong connections, the mutual information I is approximately a monotonic transformation of the root-mean-square averaged Pearson correlations between neuron pairs, a quantity that can be efficiently computed even in large systems. Furthermore, evolutionary maximization of I[x⃗(t),x⃗(t+1)] reveals a general design principle for the weight matrices, enabling the systematic construction of systems with a high spontaneous information flux. Finally, we simultaneously maximize the information flux and the mean period length of cyclic attractors in the state space of these dynamical networks. Our results are potentially useful for the construction of RNNs that serve as short-term memories or pattern generators.
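The quantities named in the abstract are easy to sketch in code. The following is a minimal illustration, not the authors' implementation: it runs a small synchronous stochastic binary network (a simplified stand-in for the paper's Boltzmann machines), estimates I[x⃗(t),x⃗(t+1)] from the empirical joint histogram over consecutive states (feasible only for small N), computes the root-mean-square of the pairwise Pearson correlations as the scalable proxy, and closes with a greedy (1+1) mutation loop as a stand-in for the paper's evolutionary maximization. Network size, weight scale, update rule, run lengths, and the selection scheme are all illustrative assumptions.

```python
import numpy as np

def simulate(W, T, rng):
    """Free-running stochastic binary network with synchronous updates
    (a simplified stand-in for the paper's Boltzmann machines); returns
    the T x N matrix of visited binary states."""
    N = W.shape[0]
    x = rng.integers(0, 2, N)
    states = np.empty((T, N), dtype=np.int8)
    for t in range(T):
        p = 1.0 / (1.0 + np.exp(-W @ x))          # sigmoidal firing probabilities
        x = (rng.random(N) < p).astype(np.int8)
        states[t] = x
    return states

def mutual_information(states):
    """Estimate I[x(t), x(t+1)] in bits from the empirical joint histogram
    over consecutive state pairs (tractable only for small N)."""
    N = states.shape[1]
    codes = states @ (1 << np.arange(N))          # encode each state as an integer
    joint = np.zeros((2**N, 2**N))
    np.add.at(joint, (codes[:-1], codes[1:]), 1.0)
    joint /= joint.sum()
    px, py = joint.sum(1), joint.sum(0)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / np.outer(px, py)[nz])))

def rms_correlation(states):
    """Root-mean-square of the off-diagonal pairwise Pearson correlations,
    the cheap proxy that remains computable in large networks."""
    C = np.corrcoef(states.T.astype(float))
    off = C[~np.eye(C.shape[0], dtype=bool)]
    return float(np.sqrt(np.nanmean(off ** 2)))   # nanmean guards constant neurons

rng = np.random.default_rng(0)
N, T = 6, 50_000                                  # assumed sizes, for illustration
W = rng.normal(0.0, 1.5, (N, N))                  # assumed "moderately strong" weights

states = simulate(W, T, rng)
print(f"I ≈ {mutual_information(states):.3f} bits, "
      f"RMS correlation ≈ {rms_correlation(states):.3f}")

# Greedy (1+1) mutation loop on the proxy objective: a stand-in for the
# paper's evolutionary maximization, which operates on I itself.
best_W, best_fit = W, rms_correlation(states)
for _ in range(30):
    child = best_W + 0.1 * rng.normal(size=W.shape)
    fit = rms_correlation(simulate(child, 10_000, rng))
    if fit > best_fit:
        best_W, best_fit = child, fit
print(f"proxy after evolution ≈ {best_fit:.3f}")
```

For large networks the histogram estimator is dropped entirely and only the correlation proxy is tracked, which is the point of the approximation the abstract describes.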

