How Good is the Bayes Posterior in Deep Neural Networks Really?

02/06/2020
by Florian Wenzel, et al.

During the past five years, the Bayesian deep learning community has developed increasingly accurate and efficient approximate inference procedures that allow for Bayesian inference in deep neural networks. However, despite this algorithmic progress and the promise of improved uncertainty quantification and sample efficiency, there are (as of early 2020) no publicized deployments of Bayesian neural networks in industrial practice. In this work we cast doubt on the current understanding of Bayes posteriors in popular deep neural networks: we demonstrate through careful MCMC sampling that the posterior predictive induced by the Bayes posterior yields systematically worse predictions than simpler methods, including point estimates obtained from SGD. Furthermore, we demonstrate that predictive performance is improved significantly through the use of a "cold posterior" that overcounts evidence. Such cold posteriors sharply deviate from the Bayesian paradigm but are commonly used as a heuristic in Bayesian deep learning papers. We put forward several hypotheses that could explain cold posteriors and evaluate them through experiments. Our work questions the goal of accurate posterior approximations in Bayesian deep learning: if the true Bayes posterior is poor, what is the use of more accurate approximations? Instead, we argue that it is timely to focus on understanding the origin of the improved performance of cold posteriors.
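The "cold posterior" at the heart of the paper is the tempered distribution p(θ | D) ∝ exp(−U(θ)/T), where U(θ) is the negative log-posterior energy: T = 1 recovers the exact Bayes posterior, while T < 1 sharpens the distribution and effectively overcounts the evidence. Below is a minimal sketch of posterior tempering with a Langevin-type sampler; it is a toy illustration on an arbitrary quadratic energy, not the authors' SG-MCMC setup, and the function names and step size are illustrative choices:

```python
import numpy as np

def sgld_step(theta, grad_U, step_size, T=1.0, rng=None):
    """One unadjusted Langevin step targeting exp(-U(theta) / T).

    T = 1.0 targets the exact Bayes posterior; T < 1.0 targets a "cold"
    posterior whose mass concentrates more tightly around the modes.
    """
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.standard_normal(theta.shape)
    return theta - step_size * grad_U(theta) + np.sqrt(2.0 * step_size * T) * noise

# Toy check: with U(theta) = theta**2 / 2 the tempered target is N(0, T),
# so long-run samples at T = 0.2 should have variance near 0.2, not 1.0.
rng = np.random.default_rng(0)
grad_U = lambda th: th  # gradient of the quadratic toy energy
theta = np.zeros(1)
samples = []
for _ in range(50_000):
    theta = sgld_step(theta, grad_U, step_size=1e-2, T=0.2, rng=rng)
    samples.append(theta[0])
print(np.var(samples))  # ~0.2: colder samples are sharper than the Bayes posterior's
```

The paper's central empirical finding is that this T < 1 regime, despite deviating from exact Bayesian inference, predicts better than the T = 1 posterior in the deep networks studied.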

Related research

06/24/2019 · Quality of Uncertainty Quantification for Bayesian Neural Network Inference
Bayesian Neural Networks (BNNs) place priors over the parameters in a ne...

10/28/2020 · Expressive yet Tractable Bayesian Deep Learning via Subnetwork Inference
The Bayesian paradigm has the potential to solve some of the core issues...

04/06/2023 · Towards Efficient MCMC Sampling in Bayesian Neural Networks by Exploiting Symmetry
Bayesian inference in deep neural networks is challenging due to the hig...

08/13/2020 · A statistical theory of cold posteriors in deep neural networks
To get Bayesian neural networks to perform comparably to standard neural...

06/16/2023 · Collapsed Inference for Bayesian Deep Learning
Bayesian neural networks (BNNs) provide a formalism to quantify and cali...

05/24/2023 · Masked Bayesian Neural Networks : Theoretical Guarantee and its Posterior Inference
Bayesian approaches for learning deep neural networks (BNN) have been re...

06/11/2021 · Disentangling the Roles of Curation, Data-Augmentation and the Prior in the Cold Posterior Effect
The "cold posterior effect" (CPE) in Bayesian deep learning describes th...
