
Imprecise Bayesian Neural Networks

by Michele Caprio, et al.
University of Pennsylvania
Rensselaer Polytechnic Institute

Uncertainty quantification and robustness to distribution shifts are important goals in machine learning and artificial intelligence. Although Bayesian neural networks (BNNs) allow uncertainty in the predictions to be assessed, different sources of uncertainty are indistinguishable. We present imprecise Bayesian neural networks (IBNNs), which generalize standard BNNs and overcome some of their drawbacks. Whereas standard BNNs are trained using a single prior and a single likelihood distribution, IBNNs are trained using credal sets of priors and likelihoods. This allows aleatoric and epistemic uncertainties to be distinguished and quantified. In addition, IBNNs are robust in the sense of Bayesian sensitivity analysis, and are more robust than BNNs to distribution shift. They can also be used to compute sets of outcomes that enjoy PAC-like properties. We apply IBNNs to two case studies: one, modeling blood glucose and insulin dynamics for artificial pancreas control, and two, motion prediction in autonomous driving scenarios. We show that IBNNs perform better than an ensemble-of-BNNs benchmark.
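As a rough illustration of the credal-set idea (a sketch, not the authors' implementation), one can approximate the credal predictive set by a finite collection of BNN posterior predictives, one per prior/likelihood pair, and decompose uncertainty over that set: the maximum predictive entropy serves as total uncertainty, the minimum as aleatoric, and their gap as epistemic. The predictive probability vectors below are hypothetical placeholder values.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (nats) along the last axis."""
    p = np.clip(p, 1e-12, 1.0)
    return -np.sum(p * np.log(p), axis=-1)

# Hypothetical class-probability vectors from K BNNs trained with
# different priors/likelihoods (extreme points of a credal set).
preds = np.array([
    [0.70, 0.20, 0.10],
    [0.55, 0.35, 0.10],
    [0.60, 0.25, 0.15],
])

# Lower and upper probability envelopes of the credal set, per class.
lower = preds.min(axis=0)
upper = preds.max(axis=0)

# Upper entropy over the set (total uncertainty), lower entropy
# (aleatoric); their difference is the epistemic component.
total = entropy(preds).max()
aleatoric = entropy(preds).min()
epistemic = total - aleatoric
```

When the models agree (a degenerate credal set with one extreme point), the envelope collapses and the epistemic term vanishes; disagreement among the priors widens the envelope and grows the epistemic share, which is the intuition behind IBNNs' ability to separate the two uncertainty types.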




Ensemble-based Uncertainty Quantification: Bayesian versus Credal Inference

The idea to distinguish and quantify two important types of uncertainty,...

An out-of-distribution discriminator based on Bayesian neural network epistemic uncertainty

Neural networks have revolutionized the field of machine learning with i...

Model Architecture Adaption for Bayesian Neural Networks

Bayesian Neural Networks (BNNs) offer a mathematically grounded framewor...

A General Framework for quantifying Aleatoric and Epistemic uncertainty in Graph Neural Networks

Graph Neural Networks (GNN) provide a powerful framework that elegantly ...

Improved uncertainty quantification for neural networks with Bayesian last layer

Uncertainty quantification is an essential task in machine learning - a ...

Being a Bit Frequentist Improves Bayesian Neural Networks

Despite their compelling theoretical properties, Bayesian neural network...

Shifts 2.0: Extending The Dataset of Real Distributional Shifts

Distributional shift, or the mismatch between training and deployment da...