Critical initialisation for deep signal propagation in noisy rectifier neural networks

11/01/2018
by Arnu Pretorius et al.

Stochastic regularisation is an important weapon in the arsenal of a deep learning practitioner. However, despite recent theoretical advances, our understanding of how noise influences signal propagation in deep neural networks remains limited. By extending recent work based on mean field theory, we develop a new framework for signal propagation in stochastic regularised neural networks. Our noisy signal propagation theory can incorporate several common noise distributions, including additive and multiplicative Gaussian noise as well as dropout. We use this framework to investigate initialisation strategies for noisy ReLU networks. We show that no critical initialisation strategy exists under additive noise: signal propagation explodes regardless of the selected noise distribution. For multiplicative noise (e.g. dropout), we identify alternative critical initialisation strategies that depend on the second moment of the noise distribution. Simulations and experiments on real-world data confirm that our proposed initialisation stably propagates signals in deep networks, while an initialisation that disregards noise fails to do so. Furthermore, we analyse the correlation dynamics between inputs. Stronger noise regularisation is shown to reduce the depth to which discriminatory information about the inputs to a noisy ReLU network is able to propagate, even when initialised at criticality. We support our theoretical predictions for these trainable depths with simulations, as well as with experiments on MNIST and CIFAR-10.
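
The multiplicative-noise result lends itself to a short illustration. The sketch below (plain NumPy; the helper names critical_sigma_w and propagate are ours, not from the paper) assumes the critical weight variance takes the form Var(W) = 2 / (fan_in * E[eps^2]), i.e. He initialisation rescaled by the second moment of the noise, which for inverted dropout with keep probability p gives E[eps^2] = 1/p. It tracks the empirical pre-activation variance through a deep noisy ReLU stack and compares this noise-aware initialisation against plain He initialisation that disregards the noise.

import numpy as np

def critical_sigma_w(fan_in, noise_second_moment):
    # Assumed form of the critical initialisation for multiplicative noise:
    # Var(W) = 2 / (fan_in * E[eps^2]); reduces to He init when E[eps^2] = 1.
    return np.sqrt(2.0 / (fan_in * noise_second_moment))

def propagate(x, depth, keep_prob, noise_aware, seed=0):
    # Push a batch through `depth` noisy ReLU layers and record the
    # empirical pre-activation variance at each layer.
    rng = np.random.default_rng(seed)
    width = x.shape[1]
    mu2 = 1.0 / keep_prob  # inverted dropout: eps in {0, 1/p}, so E[eps^2] = 1/p
    sigma_w = critical_sigma_w(width, mu2 if noise_aware else 1.0)
    variances = []
    h = x
    for _ in range(depth):
        W = rng.normal(0.0, sigma_w, size=(width, width))
        eps = rng.binomial(1, keep_prob, size=h.shape) / keep_prob
        z = (h * eps) @ W.T      # multiplicative noise, then linear map
        variances.append(z.var())
        h = np.maximum(z, 0.0)   # ReLU
    return variances

x = np.random.default_rng(1).normal(size=(256, 512))
crit = propagate(x, depth=50, keep_prob=0.8, noise_aware=True)
naive = propagate(x, depth=50, keep_prob=0.8, noise_aware=False)
print(f"layer-50 variance, noise-aware init: {crit[-1]:.2f}")
print(f"layer-50 variance, plain He init:    {naive[-1]:.2e}")

Under these assumptions, the He-initialised network's variance grows by roughly a factor of 1/p per layer (about 1.25 with p = 0.8), compounding to several orders of magnitude over 50 layers, while the noise-aware initialisation holds the variance near its fixed point, mirroring the abstract's claim about signal propagation at criticality.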

Related research

10/12/2019 · On the expected behaviour of noise regularised deep neural networks as Gaussian processes
Recent work has established the equivalence between deep neural networks...

10/13/2019 · If dropout limits trainable depth, does critical initialisation still matter? A large-scale statistical analysis on ReLU networks
Recent work in signal propagation theory has shown that dropout limits t...

06/14/2018 · Learning Dynamics of Linear Denoising Autoencoders
Denoising autoencoders (DAEs) have proven useful for unsupervised repres...

12/04/2022 · Statistical Physics of Deep Neural Networks: Initialization toward Optimal Channels
In deep learning, neural networks serve as noisy channels between input ...

09/19/2018 · Removing the Feature Correlation Effect of Multiplicative Noise
Multiplicative noise, including dropout, is widely used to regularize de...

11/04/2016 · Deep Information Propagation
We study the behavior of untrained neural networks whose weights and bia...

07/21/2019 · Fundamental aspects of noise in analog-hardware neural networks
We study and analyze the fundamental aspects of noise propagation in rec...
