Width and Depth Limits Commute in Residual Networks

02/01/2023
by Soufiane Hayou, et al.

We show that taking the width and depth to infinity in a deep neural network with skip connections, when branches are scaled by 1/√(depth) (the only nontrivial scaling), results in the same covariance structure no matter how that limit is taken. This explains why the standard infinite-width-then-depth approach provides practical insights even for networks whose depth is of the same order as their width. We also demonstrate that, in this limit, the pre-activations have Gaussian distributions, which has direct applications in Bayesian deep learning. We conduct extensive simulations that show an excellent match with our theoretical findings.
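The claim lends itself to a quick numerical check. Below is a minimal sketch (not the authors' code) that propagates two correlated inputs through random ReLU residual networks of the form x_{l+1} = x_l + (1/√L) W_{l+1} φ(x_l), with i.i.d. Gaussian weights of variance 1/width; the exact parameterization in the paper may differ, and resnet_cov is a hypothetical helper introduced here for illustration. If the width and depth limits commute, the estimated covariance should come out roughly the same whether width dominates depth or the two grow together.

```python
import numpy as np

def resnet_cov(width, depth, n_trials=100, seed=0):
    """Monte Carlo estimate of the covariance <h1_L, h2_L>/width between the
    final representations of two fixed inputs with initial overlap 0.5,
    propagated through random ResNets whose branches are scaled by 1/sqrt(depth)."""
    rng = np.random.default_rng(seed)

    # Two fixed unit-norm inputs with inner product 0.5.
    d_in = 16
    x1 = rng.standard_normal(d_in)
    x1 /= np.linalg.norm(x1)
    v = rng.standard_normal(d_in)
    v -= (v @ x1) * x1                      # component orthogonal to x1
    x2 = 0.5 * x1 + np.sqrt(1 - 0.5**2) * v / np.linalg.norm(v)

    covs = []
    for _ in range(n_trials):
        # Embed both inputs at the working width.
        W_in = rng.standard_normal((width, d_in)) / np.sqrt(d_in)
        h1, h2 = W_in @ x1, W_in @ x2
        for _ in range(depth):
            # Residual update x_{l+1} = x_l + (1/sqrt(L)) W phi(x_l),
            # with i.i.d. weights of variance 1/width and ReLU phi.
            W = rng.standard_normal((width, width)) / np.sqrt(width)
            h1 = h1 + (W @ np.maximum(h1, 0.0)) / np.sqrt(depth)
            h2 = h2 + (W @ np.maximum(h2, 0.0)) / np.sqrt(depth)
        covs.append((h1 @ h2) / width)
    return float(np.mean(covs))

# Width much larger than depth vs. width and depth of the same order:
# if the limits commute, the two estimates should roughly agree.
print(resnet_cov(width=512, depth=8))
print(resnet_cov(width=128, depth=128))
```

Up to Monte Carlo noise, the two printed values should be close, consistent with the paper's claim that the limiting covariance structure does not depend on how the joint limit is taken.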


Related research

10/03/2022 · On the infinite-depth limit of finite-width neural networks
In this paper, we study the infinite-depth limit of finite-width residua...

07/13/2020 · Probabilistic bounds on data sensitivity in deep rectifier networks
Neuron death is a complex phenomenon with implications for model trainab...

06/30/2023 · The Shaped Transformer: Attention Models in the Infinite Depth-and-Width Limit
In deep learning theory, the covariance matrix of the representations se...

03/11/2021 · Fast and Accurate Model Scaling
In this work we analyze strategies for convolutional neural network scal...

03/30/2020 · Dataless Model Selection with the Deep Frame Potential
Choosing a deep neural network architecture is a fundamental problem in ...

02/06/2020 · Duality of Width and Depth of Neural Networks
Here, we report that the depth and the width of a neural network are dua...

06/18/2021 · The Principles of Deep Learning Theory
This book develops an effective theory approach to understanding deep ne...
