A Correspondence Between Random Neural Networks and Statistical Field Theory

10/18/2017
by Samuel S. Schoenholz, et al.

A number of recent papers have provided evidence that practical design questions about neural networks may be tackled theoretically by studying the behavior of random networks. However, the tools available for analyzing random neural networks have so far been relatively ad hoc. In this work, we show that the distribution of pre-activations in random neural networks can be exactly mapped onto lattice models in statistical physics. We argue that several previous investigations of stochastic networks actually studied a particular factorial approximation to the full lattice model. For random linear networks and random rectified linear (ReLU) networks, we show that the corresponding lattice models in the wide-network limit may be systematically approximated by a Gaussian distribution with covariance between the layers of the network. In each case, the approximate distribution can be diagonalized by Fourier transformation. We show that this approximation accurately describes the results of numerical simulations of wide random neural networks. Finally, we demonstrate that in each case the large-scale behavior of the random networks can be approximated by an effective field theory.
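To make the wide-network Gaussian picture concrete, here is a minimal NumPy sketch (not the authors' code; the variable names sigma_w, sigma_b, width, and depth are our own choices) that propagates one input through a random ReLU network and compares the empirical second moment of the pre-activations at each layer against the standard mean-field recursion q^{l+1} = sigma_w^2 E[phi(z)^2] + sigma_b^2, which for the ReLU nonlinearity reduces to q^{l+1} = sigma_w^2 q^l / 2 + sigma_b^2. In the wide limit the two should agree up to O(1/sqrt(width)) fluctuations.

import numpy as np

# Minimal sketch (assumed setup, not the paper's code): compare the
# empirical pre-activation second moment q^l of a wide random ReLU
# network with the mean-field recursion
#     q^{l+1} = sigma_w^2 * q^l / 2 + sigma_b^2.

rng = np.random.default_rng(0)

width, depth = 2048, 10                  # wide-network regime: width >> depth
sigma_w, sigma_b = np.sqrt(2.0), 0.1     # sigma_w^2 = 2 is the ReLU critical point

# Treat the input as the layer-0 pre-activations, normalized so q^0 = 1.
h = rng.normal(size=width)
h *= np.sqrt(width) / np.linalg.norm(h)

q_theory = np.mean(h**2)                 # q^0 = 1 by construction
for layer in range(1, depth + 1):
    W = rng.normal(scale=sigma_w / np.sqrt(width), size=(width, width))
    b = rng.normal(scale=sigma_b, size=width)
    h = W @ np.maximum(h, 0.0) + b       # pre-activations of the next layer

    # For z ~ N(0, q), E[relu(z)^2] = q/2, giving the recursion below.
    q_theory = sigma_w**2 * q_theory / 2.0 + sigma_b**2
    print(f"layer {layer:2d}: empirical q = {np.mean(h**2):.4f}, "
          f"mean-field q = {q_theory:.4f}")

At width 2048 the two columns typically agree to within a few percent; the residual discrepancy is the kind of finite-width effect that the paper's lattice-model treatment is designed to capture systematically.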

Related research

- Renormalization in the neural network-quantum field theory correspondence (12/22/2022): A statistical ensemble of neural networks can be described in terms of a...
- Preserving gauge invariance in neural networks (12/21/2021): In these proceedings we present lattice gauge equivariant convolutional ...
- Nonperturbative renormalization for the neural network-QFT correspondence (08/03/2021): In a recent work arXiv:2008.08601, Halverson, Maiti and Stoner proposed ...
- Equivariant Neural Networks for Spin Dynamics Simulations of Itinerant Magnets (05/05/2023): I present a novel equivariant neural network architecture for the large-...
- Universal characteristics of deep neural network loss surfaces from random matrix theory (05/17/2022): This paper considers several aspects of random matrix universality in de...
- Representative volume element approximations in elastoplastic spring networks (09/26/2022): We study the large-scale behavior of a small-strain lattice model for a ...
- Random matrix theory and the loss surfaces of neural networks (06/03/2023): Neural network models are one of the most successful approaches to machi...
