The edge of chaos: quantum field theory and deep neural networks

09/27/2021
by   Kevin T. Grosvenor, et al.

We explicitly construct the quantum field theory corresponding to a general class of deep neural networks encompassing both recurrent and feedforward architectures. We first consider the mean-field theory (MFT) obtained as the leading saddlepoint in the action, and derive the condition for criticality via the largest Lyapunov exponent. We then compute the loop corrections to the correlation function in a perturbative expansion in the ratio of depth T to width N, and find a precise analogy with the well-studied O(N) vector model, in which the variance of the weight initializations plays the role of the 't Hooft coupling. In particular, we compute both the 𝒪(1) corrections quantifying fluctuations from typicality in the ensemble of networks, and the subleading 𝒪(T/N) corrections due to finite-width effects. These provide corrections to the correlation length that controls the depth to which information can propagate through the network, and thereby sets the scale at which such networks are trainable by gradient descent. Our analysis provides a first-principles approach to the rapidly emerging NN-QFT correspondence, and opens several interesting avenues to the study of criticality in deep neural networks.

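As a rough illustration of the mean-field criticality condition the abstract refers to, the sketch below numerically locates the edge of chaos for a fully connected tanh network with i.i.d. Gaussian weights (variance σ_w²/N) and biases (variance σ_b²), in the spirit of the mean-field and deep-information-propagation literature listed under related research. Here log χ₁ plays the role of the maximal Lyapunov exponent and χ₁ = 1 marks criticality. This is not the paper's general construction (which covers recurrent architectures and computes loop corrections); the parameter values and helper functions are illustrative assumptions.

```python
import numpy as np

# Gauss-Hermite_e quadrature: E_{z ~ N(0,1)}[f(z)] ~ sum_i w_i f(z_i) / sqrt(2*pi)
_Z, _W = np.polynomial.hermite_e.hermegauss(100)
_W = _W / np.sqrt(2.0 * np.pi)

def gauss_mean(f):
    """Expectation of f(z) over a standard normal z."""
    return np.sum(_W * f(_Z))

def mft_fixed_point(sw2, sb2, iters=200, q0=1.0):
    """Iterate the mean-field variance map q -> sw2 * E[tanh(sqrt(q) z)^2] + sb2 to its fixed point q*."""
    q = q0
    for _ in range(iters):
        q = sw2 * gauss_mean(lambda z: np.tanh(np.sqrt(q) * z) ** 2) + sb2
    return q

def chi1(sw2, sb2):
    """Slope of the correlation map at q*: chi1 = sw2 * E[tanh'(sqrt(q*) z)^2].
    log(chi1) acts as the maximal Lyapunov exponent; chi1 = 1 is the edge of chaos."""
    qs = mft_fixed_point(sw2, sb2)
    return sw2 * gauss_mean(lambda z: (1.0 - np.tanh(np.sqrt(qs) * z) ** 2) ** 2)

# Scan the weight variance at a fixed (assumed) bias variance and locate the ordered/chaotic transition.
sb2 = 0.05
for sw2 in np.linspace(0.5, 3.0, 11):
    c = chi1(sw2, sb2)
    print(f"sigma_w^2 = {sw2:.2f}   chi1 = {c:.3f}   ({'chaotic' if c > 1.0 else 'ordered'})")
```

For tanh networks the value of σ_w² at which χ₁ crosses 1 traces out the familiar ordered-to-chaotic phase boundary; the 𝒪(1) and 𝒪(T/N) corrections computed in the paper then shift the correlation length around this mean-field result.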


Related research

12/19/2019  Mean field theory for deep dropout networks: digging up gradient backpropagation deeply
In recent years, the mean field theory has been applied to the study of ...

12/10/2021  Unified Field Theory for Deep and Recurrent Neural Networks
Understanding capabilities and limitations of different network architec...

06/17/2022  Fast Finite Width Neural Tangent Kernel
The Neural Tangent Kernel (NTK), defined as Θ_θ^f(x_1, x_2) = [∂ f(θ, x_...

05/30/2021  Overparameterization of deep ResNet: zero loss and mean-field analysis
Finding parameters in a deep neural network (NN) that fit training data ...

11/20/2020  Normalization effects on shallow neural networks and related asymptotic expansions
We consider shallow (single hidden layer) neural networks and characteri...

07/14/2021  Towards quantifying information flows: relative entropy in deep neural networks and the renormalization group
We investigate the analogy between the renormalization group (RG) and de...

11/04/2016  Deep Information Propagation
We study the behavior of untrained neural networks whose weights and bia...