A Quantitative Functional Central Limit Theorem for Shallow Neural Networks

06/29/2023
by Valentina Cammarota, et al.

We prove a Quantitative Functional Central Limit Theorem for one-hidden-layer neural networks with generic activation function. The rates of convergence that we establish depend heavily on the smoothness of the activation function, and they range from logarithmic in non-differentiable cases such as the ReLU to √n for very regular activations. Our main tools are functional versions of the Stein-Malliavin approach; in particular, we rely heavily on a quantitative functional central limit theorem recently established by Bourguin and Campese (2020).
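To make the setting concrete, the following is a minimal numerical sketch (not the paper's construction) of the kind of width-asymptotic Gaussian behaviour the abstract refers to: a one-hidden-layer ReLU network with i.i.d. standard Gaussian weights and a 1/√n output scaling, evaluated at a fixed input, whose distribution over random networks should look increasingly Gaussian as the width n grows. The specific ReLU activation, Gaussian weight law, and the Monte Carlo diagnostics are illustrative assumptions, not the theorem's exact hypotheses or rates.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_shallow_net(x, n):
    """Evaluate a random one-hidden-layer ReLU network at inputs x.

    f(x) = n^{-1/2} * sum_j v_j * relu(w_j * x + b_j),
    with i.i.d. standard Gaussian weights (v_j, w_j, b_j).
    The 1/sqrt(n) scaling is what makes a CLT in the width n plausible.
    (Illustrative choice of activation and weight law, not the paper's setup.)
    """
    w = rng.standard_normal(n)      # hidden-layer weights
    b = rng.standard_normal(n)      # hidden-layer biases
    v = rng.standard_normal(n)      # output-layer weights
    pre = np.outer(x, w) + b        # shape (len(x), n): pre-activations
    return np.maximum(pre, 0.0) @ v / np.sqrt(n)

# Monte Carlo check at a single input point: as n grows, the law of f(x)
# over random networks should approach a centred Gaussian (excess kurtosis -> 0).
x = np.array([0.5])
for n in (10, 100, 1000):
    samples = np.array([sample_shallow_net(x, n)[0] for _ in range(5000)])
    print(f"n={n:5d}  mean={samples.mean():+.3f}  var={samples.var():.3f}  "
          f"excess kurtosis={(samples**4).mean() / samples.var()**2 - 3:+.3f}")
```

The theorem quantifies this convergence at the level of the whole random function (a functional CLT), with rates driven by the smoothness of the activation; the sketch above only probes one-dimensional marginals.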


