Scaling up deep neural networks: a capacity allocation perspective

03/11/2019
by Jonathan Donier, et al.

Following the recent work on capacity allocation, we formulate the conjecture that the shattering problem in deep neural networks can only be avoided if the capacity propagation through layers has a non-degenerate continuous limit when the number of layers tends to infinity. This allows us to study a number of commonly used architectures and determine which scaling relations should be enforced in practice as the number of layers grows large. In particular, we recover the conditions of Xavier initialization in the multi-channel case, and we find that weights and biases should be scaled down as the inverse square root of the number of layers for deep residual networks and as the inverse square root of the desired memory length for recurrent networks.
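
As a rough illustration of these scaling prescriptions, consider the sketch below. This is not the paper's reference code: the function names, the choice of the Glorot-normal variant of Xavier initialization, and the way the inverse-square-root factor is folded into the weight standard deviation are assumptions made here for illustration only.

```python
import numpy as np

def xavier_init(fan_in, fan_out, rng):
    # Standard Xavier/Glorot scaling: Var(W) = 2 / (fan_in + fan_out),
    # which keeps signal variance roughly constant across layers.
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_out, fan_in))

def residual_init(fan_in, fan_out, num_layers, rng):
    # Assumed reading of the prescription for deep residual networks:
    # shrink weights (and, analogously, biases) by 1 / sqrt(num_layers)
    # so that capacity propagation keeps a non-degenerate limit as the
    # number of layers grows large.
    return xavier_init(fan_in, fan_out, rng) / np.sqrt(num_layers)

def recurrent_init(fan_in, fan_out, memory_length, rng):
    # Analogous scaling for recurrent networks, with the desired memory
    # length playing the role of depth.
    return xavier_init(fan_in, fan_out, rng) / np.sqrt(memory_length)

rng = np.random.default_rng(0)
W_res = residual_init(256, 256, num_layers=64, rng=rng)      # std ~ xavier / 8
W_rnn = recurrent_init(128, 128, memory_length=100, rng=rng) # std ~ xavier / 10
```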


Related research

- 02/22/2019 · Capacity allocation through neural network layers
  Capacity analysis has been recently introduced as a way to analyze how l...
- 11/23/2021 · Critical initialization of wide and deep neural networks through partial Jacobians: general theory and applications to LayerNorm
  Deep neural networks are notorious for defying theoretical treatment. Ho...
- 10/03/2018 · Understanding Weight Normalized Deep Neural Networks with Rectified Linear Units
  This paper presents a general framework for norm-based capacity control ...
- 01/27/2021 · CNN with large memory layers
  This work is centred around the recently proposed product key memory str...
- 04/21/2021 · Deep limits and cut-off phenomena for neural networks
  We consider dynamical and geometrical aspects of deep learning. For many...
- 05/13/2022 · Convergence Analysis of Deep Residual Networks
  Various powerful deep neural network architectures have made great contr...
- 10/04/2022 · Polysemanticity and Capacity in Neural Networks
  Individual neurons in neural networks often represent a mixture of unrel...
