Dimension of Activity in Random Neural Networks

by David G. Clark, et al.
Columbia University

Neural networks are high-dimensional nonlinear dynamical systems that process information through the coordinated activity of many interconnected units. Understanding how biological and machine-learning networks function and learn requires knowledge of the structure of this coordinated activity, information contained in cross-covariances between units. Although dynamical mean field theory (DMFT) has elucidated several features of random neural networks – in particular, that they can generate chaotic activity – existing DMFT approaches do not support the calculation of cross-covariances. We solve this longstanding problem by extending the DMFT approach via a two-site cavity method. This reveals, for the first time, several spatial and temporal features of activity coordination, including the effective dimension, defined as the participation ratio of the spectrum of the covariance matrix. Our results provide a general analytical framework for studying the structure of collective activity in random neural networks and, more broadly, in high-dimensional nonlinear dynamical systems with quenched disorder.
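As a rough illustration (not the paper's own code), the effective dimension mentioned above is the participation ratio of the covariance-matrix eigenvalues, D_eff = (Σᵢλᵢ)² / Σᵢλᵢ². The sketch below computes it for hypothetical activity data; the variable names and the random data are assumptions for demonstration only.

```python
import numpy as np

def participation_ratio(C):
    """Participation ratio of the eigenvalue spectrum of a covariance matrix C:
    D_eff = (sum of eigenvalues)^2 / (sum of squared eigenvalues)."""
    lam = np.linalg.eigvalsh(C)          # eigenvalues of the symmetric matrix C
    return lam.sum() ** 2 / (lam ** 2).sum()

# Hypothetical example: activity of N units recorded over T time steps.
rng = np.random.default_rng(0)
N, T = 100, 5000
x = rng.standard_normal((T, N))          # uncorrelated unit activity
C = np.cov(x, rowvar=False)              # N x N covariance across units
d_eff = participation_ratio(C)           # near N, since units are uncorrelated
```

For uncorrelated units, D_eff approaches the number of units N; strongly coordinated (low-rank) activity drives it toward 1, which is why the participation ratio serves as a measure of the dimension of collective activity.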
