Non-asymptotic approximations of neural networks by Gaussian processes

02/17/2021
by Ronen Eldan et al.

We study the extent to which wide neural networks with random weights may be approximated by Gaussian processes. It is a well-established fact that as the width of a network goes to infinity, its law converges to that of a Gaussian process. We make this quantitative by establishing explicit convergence rates for the central limit theorem in an infinite-dimensional functional space, metrized with a natural transportation distance. We identify two regimes of interest: when the activation function is polynomial, its degree determines the rate of convergence, while for non-polynomial activations, the rate is governed by the smoothness of the function.
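The limiting Gaussian process can be checked numerically. The sketch below (not from the paper; width, sample count, and the ReLU choice are illustrative assumptions) samples the output of a one-hidden-layer network at a fixed input over many random initializations and compares the empirical variance with that of the limiting Gaussian process: for ReLU and w ~ N(0, I), E[relu(w·x)^2] = ||x||^2 / 2.

```python
import numpy as np

def shallow_net(x, width, rng):
    # One-hidden-layer ReLU network with i.i.d. standard Gaussian weights,
    # scaled by 1/sqrt(width) so the output variance stays O(1) as width grows.
    W = rng.standard_normal((width, x.shape[0]))
    a = rng.standard_normal(width)
    return a @ np.maximum(W @ x, 0.0) / np.sqrt(width)

x = np.array([0.6, 0.8])  # unit-norm input, so the limiting variance is 1/2
width, n_samples = 1000, 4000
outputs = np.array([shallow_net(x, width, np.random.default_rng(seed))
                    for seed in range(n_samples)])

# As width -> infinity, outputs at a fixed input should look like draws
# from N(0, ||x||^2 / 2) = N(0, 0.5).
print(outputs.mean(), outputs.var())
```

At width 1000 the empirical mean and variance are already close to the Gaussian-process values 0 and 0.5; the paper's contribution is to quantify this kind of convergence at finite width, in a functional (process-level) sense rather than at a single input.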


Related research

- Large-width functional asymptotics for deep Gaussian neural networks (02/20/2021)
- A Quantitative Functional Central Limit Theorem for Shallow Neural Networks (06/29/2023)
- Rate of Convergence of Polynomial Networks to Gaussian Processes (11/04/2021)
- Non-asymptotic approximations of Gaussian neural networks via second-order Poincaré inequalities (04/08/2023)
- Gaussian Process Behaviour in Wide Deep Neural Networks (04/30/2018)
- Large Deviations of Bivariate Gaussian Extrema (03/27/2019)
- Finite size corrections for neural network Gaussian processes (08/27/2019)
