Unified Field Theory for Deep and Recurrent Neural Networks

12/10/2021
by Kai Segadlo, et al.

Understanding the capabilities and limitations of different network architectures is of fundamental importance to machine learning. Bayesian inference on Gaussian processes has proven to be a viable approach for studying recurrent and deep networks in the limit of infinite layer width, n→∞. Here we present a unified and systematic derivation of the mean-field theory for both architectures that starts from first principles by employing established methods from the statistical physics of disordered systems. The theory elucidates that, while the mean-field equations differ in their temporal structure, they nevertheless yield identical Gaussian kernels when readouts are taken at a single time point or layer, respectively. Bayesian inference applied to classification then predicts identical performance and capabilities for the two architectures. Numerically, we find that convergence towards the mean-field theory is typically slower for recurrent networks than for deep networks, and that the speed of convergence depends non-trivially on the parameters of the weight prior as well as on the depth or number of time steps, respectively. Our method exposes that Gaussian processes are but the lowest order of a systematic expansion in 1/n. The formalism thus paves the way to investigating the fundamental differences between recurrent and deep architectures at finite widths n.
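As a concrete illustration of the kernel recursion that arises in the infinite-width limit, here is a minimal numerical sketch. It is not taken from the paper: for concreteness it assumes a fully connected ReLU network with i.i.d. Gaussian priors N(0, σ_w²/n) on the weights and N(0, σ_b²) on the biases, for which the Gaussian expectation over the nonlinearity has the closed form of the first-order arc-cosine kernel (Cho & Saul, 2009). Per the paper's result, the same layerwise kernel map also describes the equal-time kernel of the matched recurrent network when the readout is taken at a single time point; all function and parameter names below are illustrative assumptions.

```python
import numpy as np

def relu_gauss_expectation(k11, k22, k12):
    # E[ReLU(u) ReLU(v)] for zero-mean Gaussian (u, v) with
    # Var(u)=k11, Var(v)=k22, Cov(u, v)=k12; closed form from the
    # first-order arc-cosine kernel (Cho & Saul, 2009).
    rho = np.clip(k12 / np.sqrt(k11 * k22), -1.0, 1.0)
    theta = np.arccos(rho)
    return np.sqrt(k11 * k22) / (2.0 * np.pi) * (np.sin(theta) + (np.pi - theta) * rho)

def nngp_kernel(x1, x2, steps, sw2=2.0, sb2=0.1):
    # Iterates the mean-field kernel map. `steps` counts hidden layers of a
    # deep network or, equivalently in the n -> infinity limit, time steps of
    # the matched recurrent network read out at a single time point.
    d = x1.size
    k11 = sb2 + sw2 * np.dot(x1, x1) / d
    k22 = sb2 + sw2 * np.dot(x2, x2) / d
    k12 = sb2 + sw2 * np.dot(x1, x2) / d
    for _ in range(steps):
        k11_new = sb2 + sw2 * relu_gauss_expectation(k11, k11, k11)
        k22_new = sb2 + sw2 * relu_gauss_expectation(k22, k22, k22)
        k12_new = sb2 + sw2 * relu_gauss_expectation(k11, k22, k12)
        k11, k22, k12 = k11_new, k22_new, k12_new
    return k12

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x1, x2 = rng.standard_normal(50), rng.standard_normal(50)
    for steps in (1, 3, 10):
        print(f"depth/time steps = {steps:2d}, kernel = {nngp_kernel(x1, x2, steps):.4f}")
```

Varying sw2 and sb2 in this sketch shows how the weight-prior parameters shape the resulting Gaussian kernel; these are the same prior parameters that the abstract identifies as non-trivially controlling the speed of convergence towards the mean-field theory at finite width.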


