Characteristics of Monte Carlo Dropout in Wide Neural Networks

07/10/2020
by Joachim Sicking, et al.

Monte Carlo (MC) dropout is a state-of-the-art approach for uncertainty estimation in neural networks (NNs) and has been interpreted as approximately performing Bayesian inference. Building on previous work on the approximation of Gaussian processes by wide and deep neural networks with random weights, we study the limiting distribution of wide untrained NNs under dropout more rigorously and prove that they, too, converge to Gaussian processes for fixed sets of weights and biases. We sketch an argument that this property might also hold for infinitely wide feed-forward networks trained with (full-batch) gradient descent. We contrast this theory with an empirical analysis in which we find correlations and non-Gaussian behaviour in the pre-activations of finite-width NNs. We therefore investigate how (strongly) correlated weights can induce non-Gaussian behaviour in the pre-activations of such NNs.
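
The setting studied here can be illustrated with a minimal sketch of MC dropout on a single wide, untrained layer: dropout stays active at prediction time, and repeated stochastic forward passes yield a predictive mean and an uncertainty estimate. All sizes, the dropout rate, and the weight scaling below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(x, W1, W2, p, rng):
    """One stochastic forward pass with dropout kept active at test time."""
    h = np.maximum(0.0, x @ W1)        # ReLU activations of the hidden layer
    mask = rng.random(h.shape) > p     # Bernoulli dropout mask (keep prob. 1 - p)
    h = h * mask / (1.0 - p)           # inverted-dropout scaling
    return h @ W2

# Wide hidden layer with random (untrained) Gaussian weights,
# scaled by 1/sqrt(fan-in) as in the usual wide-network setting.
d_in, width = 4, 256
W1 = rng.normal(0.0, 1.0 / np.sqrt(d_in), size=(d_in, width))
W2 = rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, 1))

x = rng.normal(size=(1, d_in))

# Many stochastic forward passes on the same input.
samples = np.stack([forward(x, W1, W2, p=0.5, rng=rng) for _ in range(1000)])

mean = samples.mean(axis=0)  # predictive mean
std = samples.std(axis=0)    # spread across passes, read as uncertainty
```

In the infinite-width limit, the paper shows that the distribution of such outputs (for a fixed draw of weights and biases) converges to a Gaussian process; at finite width, the empirical distribution of `samples` can deviate from Gaussianity.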


