
Infinite-channel deep stable convolutional neural networks

02/07/2021
by Daniele Bracale, et al.

The interplay between infinite-width neural networks (NNs) and classes of Gaussian processes (GPs) has been well known since the seminal work of Neal (1996). While numerous theoretical refinements have been proposed in recent years, the interplay between NNs and GPs relies on two critical distributional assumptions on the NN's parameters: A1) finite variance; A2) independent and identical distribution (iid). In this paper, we consider the problem of removing A1 in the general context of deep feed-forward convolutional NNs. In particular, we assume iid parameters distributed according to a stable distribution, and we show that the infinite-channel limit of a deep feed-forward convolutional NN, under suitable scaling, is a stochastic process with multivariate stable finite-dimensional distributions. This limiting distribution is then characterized through an explicit backward recursion for its parameters over the layers. Our contribution extends the results of Favaro et al. (2020) to convolutional architectures, and it paves the way to extending exciting recent lines of research that rely on classes of GP limits.
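To make the setup concrete, the following is a minimal sketch (not the paper's construction) of one layer of a 1D convolutional network with iid symmetric α-stable weights, where the readout is scaled by n^(-1/α) in the number of channels n; this is the scaling regime under which the abstract says a stable limit arises. The function names, kernel size, and tanh nonlinearity are illustrative assumptions, and stable sampling uses the standard Chambers-Mallows-Stuck construction (valid for α in (0, 2), α ≠ 1).

```python
import numpy as np

def sample_stable(alpha, size, rng):
    """Symmetric alpha-stable samples via the Chambers-Mallows-Stuck method
    (beta = 0 case; assumes 0 < alpha < 2 and alpha != 1)."""
    u = rng.uniform(-np.pi / 2, np.pi / 2, size)   # uniform angle
    w = rng.exponential(1.0, size)                 # unit exponential
    return (np.sin(alpha * u) / np.cos(u) ** (1.0 / alpha)
            * (np.cos(u - alpha * u) / w) ** ((1.0 - alpha) / alpha))

def conv_layer_output(x, alpha, n_channels, kernel=3, seed=0):
    """One 1D convolution with iid stable weights/biases, tanh activation,
    and a readout scaled by n_channels^(-1/alpha) -- the scaling under
    which the channel sum converges to a stable process."""
    rng = np.random.default_rng(seed)
    length = len(x) - kernel + 1
    w = sample_stable(alpha, (n_channels, kernel), rng)  # conv filters
    b = sample_stable(alpha, n_channels, rng)            # biases
    # pre-activations: (n_channels, length)
    pre = np.array([[w[c] @ x[i:i + kernel] + b[c] for i in range(length)]
                    for c in range(n_channels)])
    h = np.tanh(pre)
    v = sample_stable(alpha, n_channels, rng)            # readout weights
    return n_channels ** (-1.0 / alpha) * (v @ h)

out = conv_layer_output(np.sin(np.linspace(0.0, 1.0, 10)),
                        alpha=1.5, n_channels=500)
```

With finite-variance weights the analogous scaling would be n^(-1/2) (the Gaussian case, A1); replacing it by n^(-1/α) is what removing A1 requires.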


Related research

- Stable behaviour of infinitely wide deep neural networks (03/01/2020)
- Deep Stable neural networks: large-width asymptotics and convergence rates (08/02/2021)
- α-Stable convergence of heavy-tailed infinitely-wide neural networks (06/18/2021)
- Neural tangent kernel analysis of shallow α-Stable ReLU neural networks (06/16/2022)
- Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes (10/11/2018)
- Predicting the outputs of finite networks trained with noisy gradients (04/02/2020)
- Doubly infinite residual networks: a diffusion process approach (07/07/2020)