GFlowOut: Dropout with Generative Flow Networks

10/24/2022
by Dianbo Liu, et al.

Bayesian inference offers principled tools to tackle many critical problems of modern neural networks, such as poor calibration, poor generalization, and data inefficiency. However, scaling Bayesian inference to large architectures is challenging and requires restrictive approximations. Monte Carlo Dropout has been widely used as a relatively cheap way to perform approximate inference and to estimate uncertainty with deep neural networks. Traditionally, the dropout mask is sampled independently from a fixed distribution. Recent works show that the dropout mask can be viewed as a latent variable, which can be inferred with variational inference. These methods face two important challenges: (a) the posterior distribution over masks can be highly multi-modal, which can be difficult to approximate with standard variational inference, and (b) it is not trivial to fully utilize sample-dependent information and correlation among dropout masks to improve posterior estimation. In this work, we propose GFlowOut to address these issues. GFlowOut leverages the recently proposed probabilistic framework of Generative Flow Networks (GFlowNets) to learn the posterior distribution over dropout masks. We empirically demonstrate that GFlowOut results in predictive distributions that generalize better to out-of-distribution data and provides uncertainty estimates that lead to better performance in downstream tasks.
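To make the core idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation): a small mask-policy network produces input-dependent Bernoulli keep probabilities for each unit, the dropout mask is sampled from them instead of from a fixed rate, and uncertainty is estimated by averaging several stochastic forward passes, MC-Dropout style. The names `LearnedMaskDropout` and `SmallNet`, and the architecture, are illustrative assumptions; the GFlowNet objective that GFlowOut uses to train the mask sampler toward the posterior is omitted.

```python
# Hypothetical sketch of input-dependent dropout-mask sampling in the spirit of
# GFlowOut; the GFlowNet training objective for the mask sampler is not shown.
import torch
import torch.nn as nn

class LearnedMaskDropout(nn.Module):
    """Dropout whose mask distribution is learned and conditioned on the input."""
    def __init__(self, dim: int):
        super().__init__()
        # Mask-policy network: outputs per-unit logits of keep probabilities.
        self.mask_logits = nn.Linear(dim, dim)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        p_keep = torch.sigmoid(self.mask_logits(h))    # per-unit keep probability
        mask = torch.bernoulli(p_keep)                  # sample a binary mask
        return h * mask / p_keep.clamp(min=1e-6)        # inverted-dropout rescaling

class SmallNet(nn.Module):
    def __init__(self, d_in: int = 16, d_hidden: int = 32, d_out: int = 10):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.drop = LearnedMaskDropout(d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc2(self.drop(torch.relu(self.fc1(x))))

# Uncertainty estimate: keep mask sampling stochastic at test time and average
# predictive distributions over several sampled masks (MC-Dropout style).
net = SmallNet()
x = torch.randn(4, 16)
probs = torch.stack([net(x).softmax(-1) for _ in range(8)]).mean(0)
```

The design choice this illustrates is that each unit's keep probability depends on the current input rather than being a single global hyperparameter, which is what allows the mask distribution to exploit sample-dependent information.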


Related research:

06/20/2020 · Calibration of Model Uncertainty for Dropout Variational Inference
The model uncertainty obtained by variational Bayesian inference with Mo...

02/16/2021 · Improving Bayesian Inference in Deep Neural Networks with Variational Structured Dropout
Approximate inference in deep Bayesian networks exhibits a dilemma of ho...

07/23/2019 · Bayesian inference for network Poisson models
This work is motivated by the analysis of ecological interaction network...

10/27/2022 · Adapting Neural Models with Sequential Monte Carlo Dropout
The ability to adapt to changing environments and settings is essential ...

10/24/2020 · Implicit Variational Inference: the Parameter and the Predictor Space
Having access to accurate confidence levels along with the predictions a...

10/08/2021 · Is MC Dropout Bayesian?
MC Dropout is a mainstream "free lunch" method in medical imaging for ap...

12/24/2020 · On Batch Normalisation for Approximate Bayesian Inference
We study batch normalisation in the context of variational inference met...