α-Stable convergence of heavy-tailed infinitely-wide neural networks

06/18/2021
by Paul Jung, et al.

We consider infinitely-wide multi-layer perceptrons (MLPs), which arise as limits of standard deep feed-forward neural networks. We assume that, for each layer, the weights of the MLP are initialized with i.i.d. samples from either a light-tailed (finite-variance) or a heavy-tailed distribution in the domain of attraction of a symmetric α-stable distribution, where α∈(0,2] may depend on the layer. For the bias terms of a layer, we assume i.i.d. initializations from a symmetric α-stable distribution with the same α parameter as that layer's weights. We then extend a recent result of Favaro, Fortini, and Peluchetti (2020) to show that the vector of pre-activation values at all nodes of a given hidden layer converges, in the infinite-width limit and under a suitable scaling, to a vector of i.i.d. random variables with symmetric α-stable distributions.
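
As a rough illustration of the scaling in the abstract (a simulation sketch, not code from the paper), the NumPy/SciPy snippet below initializes a two-layer MLP with i.i.d. symmetric α-stable weights and biases, scales the hidden-layer sum by n^(-1/α), and checks that the resulting pre-activation is heavy-tailed with tail index close to α. The specific choices α = 1.5, the tanh activation, the widths, and the Hill-estimator diagnostic are illustrative assumptions, not the authors' setup.

# Simulation sketch (illustrative, not from the paper): symmetric alpha-stable
# initialization with the n**(-1/alpha) scaling suggested by the limit theorem.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)
alpha = 1.5   # stability index of this layer; alpha in (0, 2]
d = 5         # input dimension
n = 500       # hidden-layer width (the theorem concerns n -> infinity)
m = 1000      # Monte Carlo repetitions (kept small so the script runs quickly)
x = np.ones(d)  # a fixed input

samples = np.empty(m)
for i in range(m):
    # Layer 1: i.i.d. symmetric alpha-stable weights and biases (beta = 0).
    W1 = levy_stable.rvs(alpha, 0.0, size=(n, d), random_state=rng)
    b1 = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)
    h = np.tanh(d ** (-1.0 / alpha) * (W1 @ x) + b1)  # bounded post-activations
    # Layer 2: one node; n**(-1/alpha) is the stable analogue of the 1/sqrt(n)
    # scaling used in the Gaussian (finite-variance) case.
    w2 = levy_stable.rvs(alpha, 0.0, size=n, random_state=rng)
    b2 = levy_stable.rvs(alpha, 0.0, random_state=rng)
    samples[i] = n ** (-1.0 / alpha) * (w2 @ h) + b2

# Hill estimator on the top-k order statistics of |samples|; for an
# alpha-stable limit this should be close to alpha (here 1.5).
abs_s = np.sort(np.abs(samples))[::-1]
k = 100
hill = 1.0 / np.mean(np.log(abs_s[:k] / abs_s[k]))
print(f"Hill tail-index estimate: {hill:.2f} (target alpha = {alpha})")

For α = 2 the weight distribution has finite variance, and the same recipe recovers the familiar n^(-1/2) scaling with a Gaussian infinite-width limit.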


Related research

03/01/2020 - Stable behaviour of infinitely wide deep neural networks
We consider fully connected feed-forward deep neural networks (NNs) wher...

02/07/2021 - Infinite-channel deep stable convolutional neural networks
The interplay between infinite-width neural networks (NNs) and classes o...

04/08/2023 - Infinitely wide limits for deep Stable neural networks: sub-linear, linear and super-linear activation functions
There is a growing literature on the study of large-width properties of ...

06/27/2021 - On Graphical Models and Convex Geometry
We introduce a mixture-model of beta distributions to identify significa...

04/16/2014 - Stable Graphical Models
Stable random variables are motivated by the central limit theorem for d...

12/27/2022 - Estimation of stability index for symmetric α-stable distribution using quantile conditional variance ratios
The class of α-stable distributions is widely used in various applicatio...
