Neural Networks and Quantum Field Theory

08/19/2020
by James Halverson et al.

We propose a theoretical understanding of neural networks in terms of Wilsonian effective field theory. The correspondence relies on the fact that many asymptotic neural networks are drawn from Gaussian processes (GPs), the analog of non-interacting field theories. Moving away from the asymptotic limit yields a non-Gaussian process and corresponds to turning on particle interactions, allowing for the computation of correlation functions of neural network outputs with Feynman diagrams. Minimal non-Gaussian process likelihoods are determined by the most relevant non-Gaussian terms, according to the flow in their coefficients induced by the Wilsonian renormalization group. This yields a direct connection between overparameterization and simplicity of neural network likelihoods. Whether the coefficients are constants or functions may be understood in terms of GP limit symmetries, as expected from 't Hooft's technical naturalness. General theoretical calculations are matched to neural network experiments in the simplest class of models allowing the correspondence. Our formalism is valid for any of the many architectures that become a GP in an asymptotic limit, a property preserved under certain types of training.
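
The sketch below is not code from the paper; it is a minimal numerical illustration, under standard assumptions, of the GP limit described in the abstract. It samples an ensemble of random one-hidden-layer tanh networks with 1/sqrt(width)-scaled readout weights and estimates the connected four-point function of the outputs at a fixed input. For a mean-zero Gaussian process this correlator, <f^4> - 3<f^2>^2, vanishes exactly; at finite width it plays the role of a quartic coupling and is expected to fall off roughly as 1/width. All function names and parameter choices here are illustrative.

```python
# Illustrative sketch (not code from the paper): check numerically that the
# connected four-point function of network outputs shrinks as width grows,
# as expected if the ensemble of wide networks approaches a Gaussian process.
import numpy as np

def sample_outputs(n_nets, width, x, rng):
    """Outputs f(x) of an ensemble of random one-hidden-layer tanh networks."""
    # i.i.d. weights, with 1/sqrt(width) scaling on the readout layer --
    # the standard parameterization whose infinite-width limit is a GP.
    W = rng.normal(0.0, 1.0, size=(n_nets, width, x.size))
    b = rng.normal(0.0, 1.0, size=(n_nets, width))
    v = rng.normal(0.0, 1.0 / np.sqrt(width), size=(n_nets, width))
    h = np.tanh(np.einsum("nwd,d->nw", W, x) + b)  # hidden activations
    return np.einsum("nw,nw->n", v, h)             # one scalar output per net

def connected_g4(width, x, rng, n_nets=1_000_000, batch=100_000):
    """Monte Carlo estimate of <f^4> - 3<f^2>^2 at a single input point."""
    m2 = m4 = 0.0
    for _ in range(n_nets // batch):
        f = sample_outputs(batch, width, x, rng)
        m2 += np.sum(f**2)
        m4 += np.sum(f**4)
    m2, m4 = m2 / n_nets, m4 / n_nets
    return m4 - 3.0 * m2**2  # exactly zero for a mean-zero Gaussian

rng = np.random.default_rng(0)
x = np.array([0.5, -0.3])  # fixed input point
for width in (2, 8, 32, 128):
    # The signal is O(1/width), so large widths need large ensembles.
    print(f"width={width:4d}  connected 4-pt ~ {connected_g4(width, x, rng):+.5f}")
```

Run as-is, the printed correlator should shrink toward zero with growing width, consistent with the 1/N suppression of interaction terms in the field-theory picture above.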

Related research

research · 12/22/2022
Renormalization in the neural network-quantum field theory correspondence
A statistical ensemble of neural networks can be described in terms of a...

research · 04/06/2023
Wide neural networks: From non-gaussian random fields at initialization to the NTK geometry of training
Recent developments in applications of artificial neural networks with o...

research · 08/03/2021
Nonperturbative renormalization for the neural network-QFT correspondence
In a recent work arXiv:2008.08601, Halverson, Maiti and Stoner proposed ...

research · 07/06/2023
Neural Network Field Theories: Non-Gaussianity, Actions, and Locality
Both the path integral measure in field theory and ensembles of neural n...

research · 05/17/2022
Deep neural networks with dependent weights: Gaussian Process mixture limit, heavy tails, sparsity and compressibility
This article studies the infinite-width limit of deep feedforward neural...

research · 06/18/2020
Exact posterior distributions of wide Bayesian neural networks
Recent work has shown that the prior over functions induced by a deep Ba...

research · 10/06/2020
Fixing Asymptotic Uncertainty of Bayesian Neural Networks with Infinite ReLU Features
Approximate Bayesian methods can mitigate overconfidence in ReLU network...
