Bayesian Uncertainty Estimation for Batch Normalized Deep Networks

02/18/2018
by Mattias Teye et al.

Deep neural networks have led to a series of breakthroughs, dramatically improving the state-of-the-art in many domains. The techniques driving these advances, however, lack a formal method to account for model uncertainty. While the Bayesian approach to learning provides a solid theoretical framework to handle uncertainty, inference in Bayesian-inspired deep neural networks is difficult. In this paper, we provide a practical approach to Bayesian learning that relies on a regularization technique found in nearly every modern network, batch normalization. We show that training a deep network using batch normalization is equivalent to approximate inference in Bayesian models, and we demonstrate how this finding allows us to make useful estimates of the model uncertainty. With our approach, it is possible to make meaningful uncertainty estimates using conventional architectures without modifying the network or the training procedure. Our approach is thoroughly validated in a series of empirical experiments on different tasks and using various measures, outperforming baselines with strong statistical significance and displaying competitive performance with other recent Bayesian approaches.
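The inference procedure the abstract describes — treating the stochasticity of batch normalization's mini-batch statistics as approximate Bayesian inference — can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's code: the toy data, the fixed linear layer `W`, and the helper names `bn_forward` and `mc_batchnorm_predict` are all assumptions. The key idea is that each forward pass normalizes the test input with the statistics of a different random training mini-batch, and the spread of the resulting predictions serves as an uncertainty estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy training set and a fixed linear readout (weights assumed already trained).
X_train = rng.normal(size=(256, 4))
W = rng.normal(size=(4, 1))

def bn_forward(x, stats_batch, eps=1e-5):
    """Batch-normalize x using the mean/variance of a given mini-batch,
    mimicking batch norm kept in 'training mode' at test time."""
    mu = stats_batch.mean(axis=0)
    var = stats_batch.var(axis=0)
    return (x - mu) / np.sqrt(var + eps)

def mc_batchnorm_predict(x, n_samples=50, batch_size=32):
    """Monte Carlo prediction: each pass draws a fresh training mini-batch
    for the normalization statistics, inducing a predictive distribution."""
    preds = []
    for _ in range(n_samples):
        idx = rng.choice(len(X_train), size=batch_size, replace=False)
        preds.append(bn_forward(x, X_train[idx]) @ W)
    preds = np.stack(preds)                       # (n_samples, n_points, 1)
    return preds.mean(axis=0), preds.std(axis=0)  # predictive mean, uncertainty

mean, std = mc_batchnorm_predict(rng.normal(size=(3, 4)))
```

Note that nothing about the network or its training changes here: the only modification is at test time, where the normalization statistics remain stochastic instead of being frozen to running averages.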


Related research

12/03/2019 — On the Validity of Bayesian Neural Networks for Uncertainty Estimation
Deep neural networks (DNN) are versatile parametric models utilised succ...

11/01/2018 — Stochastic Normalizations as Bayesian Learning
In this work we investigate the reasons why Batch Normalization (BN) imp...

06/07/2019 — DropConnect Is Effective in Modeling Uncertainty of Bayesian Deep Networks
Deep neural networks (DNNs) have achieved state-of-the-art performances ...

07/12/2023 — A Bayesian approach to quantifying uncertainties and improving generalizability in traffic prediction models
Deep-learning models for traffic data prediction can have superior perfo...

02/10/2018 — Deep learning with t-exponential Bayesian kitchen sinks
Bayesian learning has been recently considered as an effective means of ...

03/09/2020 — An Empirical Evaluation on Robustness and Uncertainty of Regularization Methods
Despite apparent human-level performances of deep neural networks (DNN),...

03/28/2022 — To Fold or Not to Fold: a Necessary and Sufficient Condition on Batch-Normalization Layers Folding
Batch-Normalization (BN) layers have become fundamental components in th...
