Dimension of Activity in Random Neural Networks

07/25/2022
by David G. Clark, et al.

Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many interconnected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this coordinated activity, information contained in cross-covariances between units. Although dynamical mean field theory (DMFT) has elucidated several features of random neural networks – in particular, that they can generate chaotic activity – existing DMFT approaches do not support the calculation of cross-covariances. We solve this longstanding problem by extending the DMFT approach via a two-site cavity method. This reveals, for the first time, several spatial and temporal features of activity coordination, including the effective dimension, defined as the participation ratio of the spectrum of the covariance matrix. Our results provide a general analytical framework for studying the structure of collective activity in random neural networks and, more broadly, in high-dimensional nonlinear dynamical systems with quenched disorder.
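The effective dimension described above, the participation ratio of the covariance spectrum, can be illustrated numerically. The sketch below simulates a standard random rate network, dx/dt = -x + J tanh(x) with i.i.d. Gaussian couplings of variance g²/N (a common setup for DMFT studies; the specific parameter values and integration scheme here are illustrative assumptions, not taken from the paper), then computes the participation ratio D = (Σλᵢ)² / Σλᵢ² of the eigenvalues of the activity covariance matrix.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's code): simulate a
# random rate network dx/dt = -x + J tanh(x) with couplings
# J_ij ~ N(0, g^2/N), then estimate the effective dimension of its
# activity as the participation ratio of the covariance spectrum.

rng = np.random.default_rng(0)
N, g = 200, 2.0              # network size; gain g > 1 -> chaotic regime
dt = 0.05                    # Euler time step (illustrative choice)
burn_steps, rec_steps = 2000, 4000

J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.normal(size=N)

def step(x):
    """One forward-Euler step of dx/dt = -x + J tanh(x)."""
    return x + dt * (-x + J @ np.tanh(x))

for _ in range(burn_steps):  # discard the initial transient
    x = step(x)

X = np.empty((rec_steps, N))
for t in range(rec_steps):
    x = step(x)
    X[t] = x

C = np.cov(X, rowvar=False)          # N x N covariance across units
lam = np.linalg.eigvalsh(C)          # covariance eigenvalue spectrum

# Participation ratio: D = (sum lam)^2 / sum(lam^2). It equals 1 when
# activity lies in a single mode and N when all modes are equal.
D = lam.sum() ** 2 / (lam ** 2).sum()
print(f"effective dimension D = {D:.1f} out of N = {N} units")
```

For g > 1 the network is chaotic and D typically falls well below N, reflecting the fact that cross-covariances concentrate activity in a lower-dimensional subspace; the paper's contribution is computing such quantities analytically via a two-site cavity extension of DMFT rather than by simulation.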


research
07/28/2020

Mastering high-dimensional dynamics with Hamiltonian neural networks

We detail how incorporating physics into neural network design can signi...
research
10/31/2012

Mean Field Theory of Dynamical Systems Driven by External Signals

Dynamical systems driven by strong external signals are ubiquitous in n...
research
02/13/2020

Designing spontaneous behavioral switching via chaotic itinerancy

Chaotic itinerancy is a frequently observed phenomenon in high-dimension...
research
06/05/2020

Tensorized Transformer for Dynamical Systems Modeling

The identification of nonlinear dynamics from observations is essential ...
research
07/11/2012

Dynamical Systems Trees

We propose dynamical systems trees (DSTs) as a flexible class of models ...
research
11/17/2020

Dynamical large deviations of two-dimensional kinetically constrained models using a neural-network state ansatz

We use a neural network ansatz originally designed for the variational o...
research
06/21/2021

Learn Like The Pro: Norms from Theory to Size Neural Computation

The optimal design of neural networks is a critical problem in many appl...
