Approximation power of random neural networks

06/18/2019
by Bolton Bailey, et al.

This paper investigates the approximation power of three types of random neural networks: (a) infinite-width networks, with weights following an arbitrary distribution; (b) finite-width networks obtained by subsampling the preceding infinite-width networks; (c) finite-width networks obtained by starting from a standard Gaussian initialization and then adding a vanishingly small correction to the weights. The primary result is a fully quantified bound on the rate of approximation of general continuous functions: in all three cases, a function f can be approximated with complexity ‖f‖₁ (d/δ)^O(d), where δ depends on the continuity properties of f and the complexity measure depends on the weight magnitudes and/or cardinalities. Along the way, a variety of ancillary results are developed: an exact construction of Gaussian densities with infinite-width networks, an elementary stand-alone proof scheme for approximation via convolutions of radial basis functions, subsampling rates for infinite-width networks, and a depth separation for corrected networks.
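To make the finite-width setting concrete, below is a minimal NumPy sketch in the spirit of setting (b): a random-features network whose hidden weights are sampled once and frozen, with only the output layer fit to the target. This is a generic illustration, not the paper's construction; the target function, sampling distributions, width, and all names are assumptions chosen for the example.

```python
import numpy as np

# Hypothetical illustration of a finite-width random network: hidden weights
# (W, b) are sampled i.i.d. from a standard Gaussian and frozen; only the
# output layer c is fit, here by least squares.

rng = np.random.default_rng(0)
d, width, n = 2, 512, 1000

# Assumed target: a smooth continuous function on [0, 1]^d.
f = lambda x: np.sin(2 * np.pi * x[:, 0]) * np.cos(np.pi * x[:, 1])

X = rng.uniform(0.0, 1.0, size=(n, d))
y = f(X)

# Frozen random hidden layer: ReLU(X @ W + b).
W = rng.standard_normal((d, width))
b = rng.standard_normal(width)
H = np.maximum(X @ W + b, 0.0)

# Fit output weights by least squares over the frozen features.
c, *_ = np.linalg.lstsq(H, y, rcond=None)

# Evaluate the approximation error on fresh samples.
X_test = rng.uniform(0.0, 1.0, size=(2000, d))
pred = np.maximum(X_test @ W + b, 0.0) @ c
err = np.sqrt(np.mean((pred - f(X_test)) ** 2))
print(f"RMS approximation error at width {width}: {err:.4f}")
```

Increasing the width tends to drive the fitted error down, which is the finite-width, random-feature analogue of the subsampling behavior the abstract describes; the complexity in the sketch is carried by the width and the magnitude of the output weights.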

Related research

On the infinite-depth limit of finite-width neural networks (10/03/2022)
In this paper, we study the infinite-depth limit of finite-width residua...

An Empirical Analysis of the Advantages of Finite- v.s. Infinite-Width Bayesian Neural Networks (11/16/2022)
Comparing Bayesian neural networks (BNNs) with different widths is chall...

The Separation Capacity of Random Neural Networks (07/31/2021)
Neural networks with random weights appear in a variety of machine learn...

A note on Linear Bottleneck networks and their Transition to Multilinearity (06/30/2022)
Randomly initialized wide neural networks transition to linear functions...

Asymptotics of Wide Convolutional Neural Networks (08/19/2020)
Wide neural networks have proven to be a rich class of architectures for...

Precise characterization of the prior predictive distribution of deep ReLU networks (06/11/2021)
Recent works on Bayesian neural networks (BNNs) have highlighted the nee...

Eigenspace Restructuring: a Principle of Space and Frequency in Neural Networks (12/10/2021)
Understanding the fundamental principles behind the massive success of n...
