Depth Degeneracy in Neural Networks: Vanishing Angles in Fully Connected ReLU Networks on Initialization

02/20/2023
by Cameron Jakub, et al.

Stacking many layers to create truly deep neural networks is arguably what has led to the recent explosion of these methods. However, many properties of deep neural networks are not yet understood. One such mystery is the depth degeneracy phenomenon: the deeper you make your network, the closer your network is to a constant function on initialization. In this paper, we examine the evolution of the angle between two inputs to a ReLU neural network as a function of the number of layers. By using combinatorial expansions, we find precise formulas for how fast this angle goes to zero as depth increases. Our formulas capture microscopic fluctuations that are not visible in the popular framework of infinite width limits, and yet have a significant effect on predicted behaviour. The formulas are given in terms of the mixed moments of correlated Gaussians passed through the ReLU function. We also find a surprising combinatorial connection between these mixed moments and the Bessel numbers.
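As a quick illustration of the phenomenon the abstract describes, here is a minimal NumPy sketch that tracks the angle between two inputs as they propagate through a randomly initialized fully connected ReLU network. The width (512), depth (50), and He-style variance scaling are illustrative assumptions, not parameters taken from the paper.

```python
# Minimal sketch of depth degeneracy: the angle between two inputs to a
# randomly initialized ReLU network shrinks toward zero as depth grows.
# Width, depth, and the He-style scaling below are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
width, depth = 512, 50

# Two random inputs; their initial angle is close to pi/2 in high dimension.
x = rng.standard_normal(width)
y = rng.standard_normal(width)

def angle(u, v):
    """Angle in radians between vectors u and v."""
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

print(f"layer  0: angle = {angle(x, y):.4f} rad")
for layer in range(1, depth + 1):
    # He initialization: Var(W_ij) = 2 / fan_in, the standard scaling that
    # keeps ReLU activations from exploding or vanishing in norm.
    W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
    x = np.maximum(W @ x, 0.0)  # ReLU
    y = np.maximum(W @ y, 0.0)
    if layer % 10 == 0:
        print(f"layer {layer:2d}: angle = {angle(x, y):.4f} rad")
```

With these settings the printed angle should decay steadily toward zero within a few dozen layers, so the network maps the two inputs to nearly parallel vectors; this is the degeneracy whose rate the paper's combinatorial formulas quantify.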


Related research

06/02/2023
Network Degeneracy as an Indicator of Training Performance: Comparing Finite and Infinite Width Angle Predictions
Neural networks are powerful functions with widespread use, but the theo...

04/21/2021
Deep limits and cut-off phenomena for neural networks
We consider dynamical and geometrical aspects of deep learning. For many...

06/07/2021
The Future is Log-Gaussian: ResNets and Their Infinite-Depth-and-Width Limit at Initialization
Theoretical results show that neural networks can be approximated by Gau...

06/11/2021
Precise characterization of the prior predictive distribution of deep ReLU networks
Recent works on Bayesian neural networks (BNNs) have highlighted the nee...

11/07/2018
Characterizing Well-behaved vs. Pathological Deep Neural Network Architectures
We introduce a principled approach, requiring only mild assumptions, for...

11/03/2021
A Johnson–Lindenstrauss Framework for Randomly Initialized CNNs
How does the geometric representation of a dataset change after the appl...

07/15/2022
Algorithmic Determination of the Combinatorial Structure of the Linear Regions of ReLU Neural Networks
We algorithmically determine the regions and facets of all dimensions of...
