Scaling Laws for the Principled Design, Initialization and Preconditioning of ReLU Networks

06/10/2019
by Aaron Defazio, et al.

In this work, we describe a set of rules for the design and initialization of well-conditioned neural networks, guided by the goal of naturally balancing the diagonal blocks of the Hessian at the start of training. Our design principle balances multiple sensible measures of the conditioning of neural networks. We prove that for a ReLU-based deep multilayer perceptron, a simple initialization scheme using the geometric mean of the fan-in and fan-out satisfies our scaling rule. For more sophisticated architectures, we show how our scaling principle can be used to guide design choices to produce well-conditioned neural networks, reducing guesswork.
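
The geometric-mean scaling rule mentioned in the abstract is straightforward to sketch in code. The snippet below is a minimal NumPy illustration, not the authors' implementation: it draws a layer's weights with variance proportional to 1/sqrt(fan_in * fan_out), the geometric mean of the two fans, and compares the resulting scale with He (fan-in only) and Glorot (arithmetic mean) initialization. The ReLU-style gain of 2 is an assumption; the exact constant used in the paper may differ.

# Minimal sketch of a geometric-mean fan-in/fan-out initialization.
# Assumption: weight variance = gain / sqrt(fan_in * fan_out), with a
# He-style gain of 2 for ReLU. The exact constant in the paper may differ.
import numpy as np

def geometric_mean_init(fan_in, fan_out, gain=2.0, rng=None):
    """Sample a (fan_out, fan_in) weight matrix whose entries have
    variance gain / sqrt(fan_in * fan_out)."""
    rng = rng if rng is not None else np.random.default_rng()
    std = np.sqrt(gain / np.sqrt(fan_in * fan_out))
    return rng.normal(loc=0.0, scale=std, size=(fan_out, fan_in))

# Compare the per-weight standard deviations for one layer.
fan_in, fan_out = 1024, 256
w = geometric_mean_init(fan_in, fan_out)
print("geometric-mean init std:", w.std())                           # ~ (2 / sqrt(1024 * 256)) ** 0.5
print("He init std:            ", np.sqrt(2.0 / fan_in))             # fan-in only
print("Glorot init std:        ", np.sqrt(2.0 / (fan_in + fan_out))) # arithmetic mean of fans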

Related research

09/01/2021 - A Weight Initialization Based on the Linear Product Structure for Neural Networks
  Weight initialization plays an important role in training neural network...

04/04/2023 - Effective Theory of Transformers at Initialization
  We perform an effective-theory analysis of forward-backward signal propa...

06/20/2023 - Principles for Initialization and Architecture Selection in Graph Neural Networks with ReLU Activations
  This article derives and validates three principles for initialization a...

03/15/2019 - Dying ReLU and Initialization: Theory and Numerical Examples
  The dying ReLU refers to the problem when ReLU neurons become inactive a...

06/27/2022 - AutoInit: Automatic Initialization via Jacobian Tuning
  Good initialization is essential for training Deep Neural Networks (DNNs...

12/08/2018 - Generalized Batch Normalization: Towards Accelerating Deep Neural Networks
  Utilizing recently introduced concepts from statistics and quantitative ...

03/23/2021 - Initializing ReLU networks in an expressive subspace of weights
  Using a mean-field theory of signal propagation, we analyze the evolutio...
