Dropout as a Bayesian Approximation: Appendix

06/06/2015
by   Yarin Gal, et al.

We show that a neural network of arbitrary depth, with arbitrary non-linearities and dropout applied before every weight layer, is mathematically equivalent to an approximation to a well-known Bayesian model. This interpretation may offer an explanation for some of dropout's key properties, such as its robustness to over-fitting. It also allows us to reason about uncertainty in deep learning, and to introduce Bayesian machinery into existing deep learning frameworks in a principled way. This document is an appendix to the main paper "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning" by Gal and Ghahramani, 2015.
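In practice, the uncertainty estimates this interpretation provides are obtained by keeping dropout active at test time and averaging several stochastic forward passes (often called Monte Carlo dropout). The following is a minimal sketch of that idea on a toy two-layer network with randomly chosen weights; all names and values here are illustrative, not taken from the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fixed weights for a two-layer network (hypothetical example values).
W1 = rng.standard_normal((4, 16))
W2 = rng.standard_normal((16, 1))

def forward(x, p=0.5):
    """One stochastic forward pass, dropout applied before each weight layer."""
    h = x * (rng.random(x.shape) > p)   # dropout mask on the input
    h = np.maximum(h @ W1, 0.0)         # ReLU hidden layer
    h = h * (rng.random(h.shape) > p)   # dropout mask on the hidden units
    return h @ W2

x = rng.standard_normal((1, 4))

# Monte Carlo dropout: average T stochastic passes with dropout still on.
T = 200
samples = np.stack([forward(x) for _ in range(T)])

mean = samples.mean(axis=0)  # predictive mean
var = samples.var(axis=0)    # spread across passes, used as an uncertainty estimate
```

The sample mean approximates the predictive mean of the corresponding Bayesian model, and the sample variance gives a cheap model-uncertainty signal without changing the network's training procedure.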


