Uncertainty-based out-of-distribution detection requires suitable function space priors

10/12/2021
by Francesco D'Angelo, et al.

The need to avoid confident predictions on unfamiliar data has sparked interest in out-of-distribution (OOD) detection. It is widely assumed that Bayesian neural networks (BNNs) are well suited for this task, as the endowed epistemic uncertainty should lead to disagreement in predictions on outliers. In this paper, we question this assumption and show that proper Bayesian inference with function space priors induced by neural networks does not necessarily lead to good OOD detection. To circumvent the use of approximate inference, we start by studying the infinite-width case, where Bayesian inference can be exact due to the correspondence with Gaussian processes. Strikingly, the kernels induced under common architectural choices lead to uncertainties that do not reflect the underlying data generating process and are therefore unsuited for OOD detection. Importantly, we find this OOD behavior to be consistent with the corresponding finite-width networks. Desirable function space properties can be encoded in the prior in weight space; however, this currently only applies to a specified subset of the domain and thus does not inherently extend to OOD data. Finally, we argue that a trade-off between generalization and OOD capabilities might render the application of BNNs for OOD detection undesirable in practice. Overall, our study reveals fundamental problems with naively using BNNs for OOD detection and opens interesting avenues for future research.
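As background for the infinite-width argument, here is a minimal sketch (not from the paper) of the kind of exact inference it relies on: a Gaussian process with the degree-1 arc-cosine kernel (Cho & Saul, 2009), which is the NNGP kernel of an infinitely wide single-hidden-layer ReLU network. The data, hyperparameters, and test points below are illustrative assumptions; whether the posterior variance actually grows for a given outlier depends entirely on the kernel, which is the crux of the paper's critique.

```python
import numpy as np

def relu_nngp_kernel(X, Z, sigma_w=1.0, sigma_b=0.0):
    """Degree-1 arc-cosine kernel: NNGP kernel of an infinitely wide
    one-hidden-layer ReLU network (illustrative hyperparameters)."""
    nx = np.linalg.norm(X, axis=1)[:, None]
    nz = np.linalg.norm(Z, axis=1)[None, :]
    cos = np.clip((X @ Z.T) / (nx * nz), -1.0, 1.0)
    theta = np.arccos(cos)
    J = np.sin(theta) + (np.pi - theta) * cos
    return sigma_w**2 / (2 * np.pi) * nx * nz * J + sigma_b**2

def gp_posterior_var(X_train, x_test, noise=1e-2):
    """Exact GP posterior predictive variance at a single test point."""
    K = relu_nngp_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    k_star = relu_nngp_kernel(X_train, x_test)
    k_ss = relu_nngp_kernel(x_test, x_test)
    return (k_ss - k_star.T @ np.linalg.solve(K, k_star)).item()

rng = np.random.default_rng(0)
X_train = rng.normal(size=(50, 2))                      # toy in-distribution data
v_in = gp_posterior_var(X_train, np.array([[0.5, 0.5]]))
v_ood = gp_posterior_var(X_train, np.array([[50.0, 50.0]]))  # far from the data
print(v_in, v_ood)
```

Note that this kernel depends on input norms and angles only: a distant point lying along a direction already covered by the training data remains highly correlated with it, so the posterior uncertainty need not signal that the point is OOD.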


Related research:

- 07/26/2021 · Are Bayesian neural networks intrinsically good at out-of-distribution detection?
  The need to avoid confident predictions on unfamiliar data has sparked i...
- 09/30/2019 · Non-Gaussian processes and neural networks at finite widths
  Gaussian processes are ubiquitous in nature and engineering. A case in p...
- 01/13/2022 · An Overview of Uncertainty Quantification Methods for Infinite Neural Networks
  To better understand the theoretical behavior of large neural networks, ...
- 10/06/2021 · Bayesian neural network unit priors and generalized Weibull-tail property
  The connection between Bayesian neural networks and Gaussian processes g...
- 05/14/2021 · BNNpriors: A library for Bayesian neural network inference with different prior distributions
  Bayesian neural networks have shown great promise in many applications w...
- 02/15/2020 · Holes in Bayesian Statistics
  Every philosophy has holes, and it is the responsibility of proponents o...
- 07/21/2022 · Correcting Model Bias with Sparse Implicit Processes
  Model selection in machine learning (ML) is a crucial part of the Bayesi...
