On the expected behaviour of noise regularised deep neural networks as Gaussian processes

10/12/2019
by Arnu Pretorius et al.

Recent work has established the equivalence between deep neural networks and Gaussian processes (GPs), resulting in so-called neural network Gaussian processes (NNGPs). The behaviour of these models depends on the initialisation of the corresponding network. In this work, we consider the impact of noise regularisation (e.g. dropout) on NNGPs, and relate their behaviour to signal propagation theory in noise regularised deep neural networks. For ReLU activations, we find that the best performing NNGPs have kernel parameters that correspond to a recently proposed initialisation scheme for noise regularised ReLU networks. In addition, we show how the noise influences the covariance matrix of the NNGP, producing a stronger prior towards simple functions away from the training points. We verify our theoretical findings with experiments on MNIST and CIFAR-10 as well as on synthetic data.
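The abstract refers to the NNGP kernel for ReLU networks and the way multiplicative noise (e.g. dropout) reshapes its covariance. As an illustration only, the sketch below combines the standard arc-cosine (ReLU) kernel recursion with a simplified dropout correction in which each layer's same-input variance is inflated by the noise second moment (1/keep_prob); the parameter names `sigma_w2`, `sigma_b2`, and `keep_prob` are assumptions for this sketch, not the paper's notation.

```python
import numpy as np

def relu_nngp_kernel(X, depth=3, sigma_w2=2.0, sigma_b2=0.0, keep_prob=0.8):
    """Sketch of a ReLU NNGP kernel recursion with a dropout-style
    noise correction (illustrative, not the paper's exact derivation).

    Each layer applies the arc-cosine kernel recursion, then scales the
    diagonal (same-input variances) by 1/keep_prob to mimic the second
    moment of multiplicative dropout noise. Off-diagonal covariances are
    left unscaled, so noise weakens correlations between distinct inputs
    as depth grows.
    """
    # Base case: linear kernel on the raw inputs.
    K = X @ X.T / X.shape[1]
    for _ in range(depth):
        diag = np.sqrt(np.diag(K))
        norm = np.outer(diag, diag)
        cos_theta = np.clip(K / norm, -1.0, 1.0)
        theta = np.arccos(cos_theta)
        # Arc-cosine (ReLU) kernel recursion.
        K = (sigma_w2 / (2.0 * np.pi)) * norm * (
            np.sin(theta) + (np.pi - theta) * np.cos(theta)
        ) + sigma_b2
        # Simplified noise model: inflate same-input variance only.
        K[np.diag_indices_from(K)] /= keep_prob
    return K
```

Note that `sigma_w2 = 2.0` is the critical weight variance for noiseless ReLU networks; the paper's point is that under noise the critical initialisation (and hence the best-performing kernel parameters) shifts to compensate for the inflated variance.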


Related research

- 11/01/2018: Critical initialisation for deep signal propagation in noisy rectifier neural networks. Stochastic regularisation is an important weapon in the arsenal of a dee...
- 05/10/2021: Deep Neural Networks as Point Estimates for Deep Gaussian Processes. Deep Gaussian processes (DGPs) have struggled for relevance in applicati...
- 07/10/2020: Characteristics of Monte Carlo Dropout in Wide Neural Networks. Monte Carlo (MC) dropout is one of the state-of-the-art approaches for u...
- 09/17/2022: Interrelation of equivariant Gaussian processes and convolutional neural networks. Currently there exists a rather promising new trend in machine learning (ML...
- 06/18/2020: Infinite attention: NNGP and NTK for deep attention networks. There is a growing amount of literature on the relationship between wide...
- 02/24/2014: Avoiding pathologies in very deep networks. Choosing appropriate architectures and regularization strategies for dee...
- 11/02/2020: Learning in the Wild with Incremental Skeptical Gaussian Processes. The ability to learn from human supervision is fundamental for personal ...
