# On the infinite-depth limit of finite-width neural networks

In this paper, we study the infinite-depth limit of finite-width residual neural networks with random Gaussian weights. With proper scaling, we show that fixing the width and taking the depth to infinity causes the vector of pre-activations to converge in distribution to a zero-drift diffusion process. Unlike the infinite-width limit, where the pre-activations converge weakly to a Gaussian random variable, the infinite-depth limit yields different distributions depending on the choice of the activation function. We document two cases where these distributions admit distinct closed-form expressions. We further show an intriguing phase-transition phenomenon in the post-activation norms as the width increases from 3 to 4. Lastly, we study the sequential limit infinite-depth-then-infinite-width and highlight some key differences from the more commonly studied infinite-width-then-infinite-depth limit.

02/01/2023
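The setting above can be illustrated with a small numerical sketch: a finite-width residual network at initialization, iterated to large depth. This is only an assumed reading of the setup, not code from the paper; in particular, the `1/sqrt(depth)` residual scaling and the `N(0, 1/width)` weight variance are assumptions standing in for the paper's "proper scaling".

```python
import numpy as np

def simulate_resnet(width=4, depth=10_000, activation=np.tanh, seed=0):
    """Simulate a finite-width residual network at initialization.

    Assumed residual update: x <- x + sqrt(1/depth) * W @ phi(x),
    with W having i.i.d. Gaussian N(0, 1/width) entries, the width held
    fixed and the depth taken large.
    """
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(width)  # random input vector of fixed width
    for _ in range(depth):
        # Fresh Gaussian weight matrix at each layer, variance 1/width
        W = rng.standard_normal((width, width)) / np.sqrt(width)
        # sqrt(1/depth) scaling keeps the accumulated update of order one
        x = x + np.sqrt(1.0 / depth) * W @ activation(x)
    return x

x = simulate_resnet()
print(np.linalg.norm(x))
```

Under this scaling, each layer contributes a mean-zero increment of variance proportional to `1/depth`, so the total update behaves like a diffusion rather than blowing up or vanishing; rerunning with different seeds samples from the limiting pre-activation distribution the abstract describes.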
