Critical initialization of wide and deep neural networks through partial Jacobians: general theory and applications to LayerNorm

11/23/2021
by Darshil Doshi, et al.

Deep neural networks are notorious for defying theoretical treatment. However, when the number of parameters in each layer tends to infinity, the network function is a Gaussian process (GP) and a quantitatively predictive description becomes possible. The Gaussian approximation allows one to formulate criteria for selecting hyperparameters, such as the variances of weights and biases, as well as the learning rate. These criteria rely on the notion of criticality defined for deep neural networks. In this work we describe a new way to diagnose this criticality, both theoretically and empirically. To that end, we introduce partial Jacobians of a network, defined as derivatives of the preactivations in layer l with respect to the preactivations in layer l_0 < l. These quantities are particularly useful when the network architecture involves many different layers. We discuss various properties of the partial Jacobians, such as their scaling with depth and their relation to the neural tangent kernel (NTK). We derive recurrence relations for the partial Jacobians and use them to analyze the criticality of deep MLP networks with (and without) LayerNorm. We find that the normalization layer changes the optimal values of the hyperparameters and the critical exponents. We argue that LayerNorm is more stable when applied to preactivations rather than to activations, owing to a larger correlation depth.
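The paper's own recurrence relations are not reproduced on this page. As orientation, the sketch below shows the standard infinite-width (mean-field) recurrence obeyed by the averaged partial Jacobian norm of an MLP; the symbols chi, K^l and the per-neuron normalization are generic mean-field notation and are not taken from the paper itself.

```latex
% Sketch only, under standard mean-field assumptions; notation is illustrative.
% J^{(l_0, l)} denotes the squared Frobenius norm of the partial Jacobian
% \partial h^l / \partial h^{l_0}, normalized per neuron.
\[
  \mathcal{J}^{(l_0,\,l+1)} \;=\; \chi^{\,l}\,\mathcal{J}^{(l_0,\,l)},
  \qquad
  \chi^{\,l} \;=\; \sigma_w^{2}\;\mathbb{E}_{h \sim \mathcal{N}(0,\,\mathcal{K}^{l})}\!\bigl[\phi'(h)^{2}\bigr],
  \qquad
  \mathcal{J}^{(l_0,\,l_0)} \;=\; 1 .
\]
% Criticality corresponds to \chi = 1 at the fixed point of the preactivation
% kernel \mathcal{K}^l: the partial Jacobian then neither grows nor decays
% exponentially with the depth l - l_0.
```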
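For readers who want to probe this empirically, here is a minimal JAX sketch (not the authors' code; all function names and parameter choices are illustrative assumptions) that estimates the averaged partial Jacobian norm of a tanh MLP at the well-known tanh critical initialization sigma_w^2 = 1, sigma_b^2 = 0 and prints how it behaves with depth.

```python
# Minimal sketch (not the authors' code): empirically estimating the averaged
# partial Jacobian norm of a deep tanh MLP as a criticality diagnostic.
# All function names and parameter choices here are illustrative assumptions.
import jax
import jax.numpy as jnp


def init_mlp(key, depth, width, sigma_w=1.0, sigma_b=0.0):
    """Sample weights W^l ~ N(0, sigma_w^2 / width) and biases b^l ~ N(0, sigma_b^2)."""
    params = []
    for _ in range(depth):
        key, wk, bk = jax.random.split(key, 3)
        W = sigma_w / jnp.sqrt(width) * jax.random.normal(wk, (width, width))
        b = sigma_b * jax.random.normal(bk, (width,))
        params.append((W, b))
    return params


def preactivations(params, h0, l0, l):
    """Propagate preactivations from layer l0 to layer l: h^{l+1} = W phi(h^l) + b."""
    h = h0
    for W, b in params[l0:l]:
        h = W @ jnp.tanh(h) + b
    return h


def partial_jacobian_norm(params, h0, l0, l):
    """Per-neuron squared Frobenius norm of dh^l / dh^{l0}."""
    J = jax.jacobian(lambda h: preactivations(params, h, l0, l))(h0)
    return jnp.sum(J ** 2) / J.shape[0]


key = jax.random.PRNGKey(0)
width, depth = 512, 32
# sigma_w^2 = 1, sigma_b^2 = 0 is the standard critical initialization for tanh MLPs.
params = init_mlp(key, depth, width, sigma_w=1.0, sigma_b=0.0)
h0 = jax.random.normal(jax.random.PRNGKey(1), (width,))

# At criticality the norm stays O(1) (up to a slow, power-law drift with depth);
# away from criticality it grows or decays exponentially in l - l0.
for l in (4, 8, 16, 32):
    print(l, float(partial_jacobian_norm(params, h0, 0, l)))
```

Sweeping sigma_w away from 1 in this sketch makes the exponential growth or decay of the norm with depth immediately visible.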

Related research

11/05/2018
How deep is deep enough? - Optimizing deep neural network architecture
Deep neural networks use stacked layers of feature detectors to repeated...

09/02/2022
Normalization effects on deep neural networks
We study the effect of normalization on the layers of deep neural networ...

03/11/2019
Scaling up deep neural networks: a capacity allocation perspective
Following the recent work on capacity allocation, we formulate the conje...

04/21/2021
Deep limits and cut-off phenomena for neural networks
We consider dynamical and geometrical aspects of deep learning. For many...

12/12/2019
On the relationship between multitask neural networks and multitask Gaussian Processes
Despite the effectiveness of multitask deep neural network (MTDNN), ther...

08/16/2018
Deep Convolutional Networks as shallow Gaussian Processes
We show that the output of a (residual) convolutional neural network (CN...

04/07/2018
Continuously Constructive Deep Neural Networks
Traditionally, deep learning algorithms update the network weights where...
