Deep limits and cut-off phenomena for neural networks

04/21/2021
by Benny Avelin, et al.

We consider dynamical and geometrical aspects of deep learning. For many standard choices of layer maps we display semi-invariant metrics which quantify differences between data or decision functions. This allows us, when considering random layer maps and using non-commutative ergodic theorems, to deduce that certain limits exist as the number of layers tends to infinity. We also examine the random initialization of standard networks, where we observe a surprising cut-off phenomenon in terms of the number of layers, i.e. the depth of the network. This could be a relevant parameter when choosing an appropriate number of layers for a given learning task, or for selecting a good initialization procedure. More generally, we hope that the notions and results in this paper can provide a framework, in particular a geometric one, for a part of the theoretical understanding of deep neural networks.
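The depth-dependent behavior of randomly initialized networks described above can be illustrated with a minimal numerical sketch (this is an illustration of the general phenomenon, not the paper's construction): propagate two inputs through a deep fully connected ReLU network with random Gaussian weights and track the angle between their hidden representations as depth grows.

```python
import numpy as np

# Hypothetical sketch: two random inputs pushed through a deep, randomly
# initialized fully connected ReLU network; we record the angle between
# their hidden representations at each layer.
rng = np.random.default_rng(0)
width, depth = 512, 50

def angle(u, v):
    # Angle between two vectors, clipped for numerical safety.
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

x = rng.standard_normal(width)
y = rng.standard_normal(width)
angles = [angle(x, y)]
for _ in range(depth):
    # He-style scaling keeps activation norms roughly stable under ReLU.
    W = rng.standard_normal((width, width)) * np.sqrt(2.0 / width)
    x, y = np.maximum(W @ x, 0.0), np.maximum(W @ y, 0.0)
    angles.append(angle(x, y))

print(angles[0], angles[-1])
```

In such experiments the angle typically shrinks toward zero with depth, so representations of distinct inputs become nearly parallel in very deep random ReLU networks; how fast this happens, and whether it happens abruptly, is the kind of depth-dependent question the abstract refers to.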


Related research

02/20/2023 · Depth Degeneracy in Neural Networks: Vanishing Angles in Fully Connected ReLU Networks on Initialization
Stacking many layers to create truly deep neural networks is arguably wh...

07/01/2019 · On Symmetry and Initialization for Neural Networks
This work provides an additional step in the theoretical understanding o...

11/23/2021 · Critical initialization of wide and deep neural networks through partial Jacobians: general theory and applications to LayerNorm
Deep neural networks are notorious for defying theoretical treatment. Ho...

11/02/2021 · Subquadratic Overparameterization for Shallow Neural Networks
Overparameterization refers to the important phenomenon where the width ...

03/12/2023 · Phase Diagram of Initial Condensation for Two-layer Neural Networks
The phenomenon of distinct behaviors exhibited by neural networks under ...

03/11/2019 · Scaling up deep neural networks: a capacity allocation perspective
Following the recent work on capacity allocation, we formulate the conje...

12/27/2019 · Emergence of Network Motifs in Deep Neural Networks
Network science can offer fundamental insights into the structural and f...
