Mean field theory for deep dropout networks: digging up gradient backpropagation deeply

12/19/2019
by Wei Huang, et al.

In recent years, mean field theory has been applied to the study of neural networks with great success. It has been used to analyze a variety of architectures, including CNNs, RNNs, residual networks, and batch normalization, and recent work has extended it to dropout. Mean field theory shows the existence of depth scales that limit the maximum depth of signal propagation and gradient backpropagation. However, the gradient backpropagation results are derived under the gradient independence assumption, namely that the weights used in the feed-forward pass are drawn independently of those used in backpropagation. This is not how neural networks are trained in practice: the weights used in a feed-forward step are the same ones used in the corresponding backpropagation step. Under this realistic condition, we perform theoretical computations on linear dropout networks and a series of experiments on general dropout networks. Our empirical results reveal an interesting phenomenon: the depths to which gradients can backpropagate for a single input and for a pair of inputs are governed by the same depth scale. In addition, we study the relationship between the variance and the mean of statistical metrics of the gradient and show the emergence of universality. Finally, we investigate the maximum trainable length of deep dropout networks through a series of experiments on MNIST and CIFAR10 and provide an empirical formula for the trainable length that is more precise than that of the original work.
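As a rough illustration of the kind of measurement the abstract describes, the sketch below builds a deep fully-connected dropout network at initialization and records per-layer gradient statistics after a single backward pass, in which autograd reuses the forward weights and dropout masks (the realistic setting, not the gradient independence assumption). This is a minimal sketch, not the authors' code; the width, depth, dropout rate, and initialization variances are hypothetical choices for illustration only.

```python
# Minimal sketch: empirical depth scale of gradient backpropagation in a deep
# dropout network at initialization. All hyperparameters below are assumptions.
import torch
import torch.nn as nn

width, depth = 300, 50
keep_prob = 0.9            # dropout keep probability
sigma_w, sigma_b = 1.5, 0.05

layers = []
for _ in range(depth):
    linear = nn.Linear(width, width)
    # Variance scaling of the mean-field form sigma_w^2 / (width * keep_prob)
    nn.init.normal_(linear.weight, std=sigma_w / (width * keep_prob) ** 0.5)
    nn.init.normal_(linear.bias, std=sigma_b)
    layers += [linear, nn.Tanh(), nn.Dropout(p=1 - keep_prob)]
net = nn.Sequential(*layers)
net.train()  # keep dropout active, as in the mean-field analysis

x = torch.randn(1, width)
loss = net(x).sum()
loss.backward()  # backprop reuses the same weights and dropout masks

# The decay (or growth) of the mean squared weight gradient with layer index
# gives an empirical estimate of the backpropagation depth scale.
grad_stats = [m.weight.grad.pow(2).mean().item()
              for m in net if isinstance(m, nn.Linear)]
for ell, g in enumerate(grad_stats):
    print(f"layer {ell:3d}  mean squared weight gradient {g:.3e}")
```

Repeating this over many random seeds and inputs (single inputs and pairs of inputs) would give the gradient mean and variance statistics whose depth dependence the paper studies.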
