Variational Bayesian dropout: pitfalls and fixes

07/05/2018
by Jiri Hron, et al.

Dropout, a stochastic regularisation technique for training neural networks, has recently been reinterpreted as a specific type of approximate inference algorithm for Bayesian neural networks. The main contribution of the reinterpretation is in providing a theoretical framework useful for analysing and extending the algorithm. We show that the proposed framework suffers from several issues, ranging from undefined or pathological behaviour of the true posterior caused by the use of improper priors, to an ill-defined variational objective due to singularity of the approximating distribution relative to the true posterior. Our analysis of the improper log-uniform prior used in variational Gaussian dropout suggests these pathologies are generally irredeemable, and that the algorithm still works only because the variational formulation annuls some of them. To address the singularity issue, we proffer Quasi-KL (QKL) divergence, a new approximate inference objective for approximating high-dimensional distributions. We show that motivations for variational Bernoulli dropout based on discretisation and added noise have QKL as a limit. We study the properties of QKL both theoretically and on a simple practical example, which shows that the QKL-optimal approximation of a full-rank Gaussian by a degenerate one naturally leads to the Principal Component Analysis solution.
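The improperness of the log-uniform prior mentioned above is easy to see directly: variational Gaussian dropout (Kingma et al., 2015) places p(log |w|) ∝ const, i.e. p(w) ∝ 1/|w|, whose normalising integral diverges at both 0 and infinity. A minimal numeric sketch (not from the paper, purely illustrative) confirms that the mass on [eps, 1] grows without bound as eps shrinks:

```python
import numpy as np

def mass(eps, upper=1.0, n=200_000):
    """Trapezoidal integral of the unnormalised log-uniform density 1/w
    over [eps, upper], evaluated on a log-spaced grid."""
    w = np.logspace(np.log10(eps), np.log10(upper), n)
    f = 1.0 / w
    return float(np.sum(0.5 * (f[:-1] + f[1:]) * np.diff(w)))

# The mass diverges like log(1/eps) as eps -> 0: no finite normaliser exists.
for eps in [1e-1, 1e-3, 1e-6, 1e-12]:
    print(f"integral of 1/w over [{eps:g}, 1] = {mass(eps):.3f}")
```

The printed values track log(1/eps), so the prior assigns infinite total mass near zero alone; the same happens on the upper tail.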
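The closing claim, that the QKL-optimal degenerate approximation of a full-rank Gaussian recovers the PCA solution, can be pictured with standard linear algebra. The sketch below is an illustration using ordinary eigendecomposition, not the paper's QKL derivation: a rank-k Gaussian whose support lies in the span of the top-k eigenvectors of the target covariance captures the largest share of its variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# A full-rank covariance with a clear spectral gap between the top two
# directions and the rest (the diagonal boost is an arbitrary choice).
A = rng.standard_normal((4, 4))
Sigma = A @ A.T + np.diag([10.0, 5.0, 0.1, 0.1])

# eigh returns eigenvalues in ascending order; sort descending so that
# the leading columns of U are the principal directions.
evals, U = np.linalg.eigh(Sigma)
order = np.argsort(evals)[::-1]
evals, U = evals[order], U[:, order]

k = 2
U_k = U[:, :k]
# Degenerate rank-k Gaussian: covariance supported on the top-k eigenspace.
Sigma_k = U_k @ np.diag(evals[:k]) @ U_k.T

captured = np.trace(Sigma_k) / np.trace(Sigma)
print(f"variance captured by the rank-{k} approximation: {captured:.3f}")
```

The degenerate covariance `Sigma_k` is exactly the PCA truncation of `Sigma`; the paper's contribution is showing that minimising QKL against the full-rank Gaussian selects this same subspace.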


Related research

- Variational Dropout via Empirical Bayes (11/01/2018): We study the Automatic Relevance Determination procedure applied to deep...
- Variational Gaussian Dropout is not Bayesian (11/08/2017): Gaussian multiplicative noise is commonly used as a stochastic regularis...
- Variational Bayesian Dropout (11/19/2018): Variational dropout (VD) is a generalization of Gaussian dropout, which ...
- Bayesian Dropout (08/12/2015): Dropout has recently emerged as a powerful and simple method for trainin...
- Is MC Dropout Bayesian? (10/08/2021): MC Dropout is a mainstream "free lunch" method in medical imaging for ap...
- Loss-Calibrated Approximate Inference in Bayesian Neural Networks (05/10/2018): Current approaches in approximate inference for Bayesian neural networks...
- Dropout Inference with Non-Uniform Weight Scaling (04/27/2022): Dropout as regularization has been used extensively to prevent overfitti...
