Improved uncertainty quantification for neural networks with Bayesian last layer

02/21/2023
by Felix Fiedler, et al.

Uncertainty quantification is an essential task in machine learning - a task in which neural networks (NNs) have traditionally not excelled. Bayesian neural networks (BNNs), in which parameters and predictions are probability distributions, can be a remedy for some applications, but often require expensive sampling for training and inference. NNs with Bayesian last layer (BLL) are simplified BNNs where only the weights in the last layer and the predictions follow a normal distribution. They are conceptually related to Bayesian linear regression (BLR), which has recently gained popularity in learning-based control under uncertainty. Both consider a non-linear feature space which is linearly mapped to the output, and hyperparameters, for example the noise variance. For NNs with BLL, these hyperparameters should include the deterministic weights of all other layers, as these impact the feature space and thus the predictive performance. Unfortunately, the marginal likelihood is expensive to evaluate in this setting and prohibits direct training through back-propagation. In this work, we present a reformulation of the BLL log-marginal likelihood, which considers weights in previous layers as hyperparameters and allows for efficient training through back-propagation. Furthermore, we derive a simple method to improve the extrapolation uncertainty of NNs with BLL. In a multivariate toy example and in a dynamic system identification task, we show that NNs with BLL, trained with our proposed algorithm, outperform standard BLR with NN features.
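The setting the abstract describes, a fixed non-linear feature map whose output is linearly mapped to the target with Gaussian weights, can be illustrated with standard Bayesian linear regression. The sketch below is a minimal, hypothetical example (the feature map, prior precision, and noise variance are assumptions for illustration, not the paper's trained NN or algorithm): it computes the Gaussian posterior over the last-layer weights and the resulting predictive mean and variance.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(x):
    # Stand-in for a trained NN's penultimate-layer output; here a
    # simple hand-picked basis (bias, sin, cos) for a scalar input.
    return np.column_stack([np.ones_like(x), np.sin(x), np.cos(x)])

# Toy training data: noisy sine.
x_train = rng.uniform(-3, 3, 50)
y_train = np.sin(x_train) + 0.1 * rng.standard_normal(50)

Phi = features(x_train)          # (n, d) feature matrix
alpha = 1.0                      # assumed prior precision on the weights
sigma2 = 0.1 ** 2                # assumed observation-noise variance

# Gaussian posterior over the last-layer weights: N(mu, Sigma).
Sigma_inv = alpha * np.eye(Phi.shape[1]) + Phi.T @ Phi / sigma2
Sigma = np.linalg.inv(Sigma_inv)
mu = Sigma @ Phi.T @ y_train / sigma2

# Predictive distribution at test inputs: mean and per-point variance.
x_test = np.linspace(-4, 4, 5)
Phi_t = features(x_test)
pred_mean = Phi_t @ mu
pred_var = sigma2 + np.einsum('ij,jk,ik->i', Phi_t, Sigma, Phi_t)
```

The predictive variance decomposes into the irreducible noise `sigma2` plus a feature-dependent term `phi(x)^T Sigma phi(x)`; in an NN with BLL, the paper's contribution is to also treat the weights producing `phi(x)` as trainable hyperparameters via the marginal likelihood.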


Related research

08/26/2019 · Marginally-calibrated deep distributional regression
Deep neural network (DNN) regression models are widely used in applicati...

07/23/2017 · Learning uncertainty in regression tasks by deep neural networks
We suggest a general approach to quantification of different types of un...

02/07/2022 · NUQ: Nonparametric Uncertainty Quantification for Deterministic Neural Networks
This paper proposes a fast and scalable method for uncertainty quantific...

11/25/2020 · Bayesian Triplet Loss: Uncertainty Quantification in Image Retrieval
Uncertainty quantification in image retrieval is crucial for downstream ...

05/20/2015 · Weight Uncertainty in Neural Networks
We introduce a new, efficient, principled and backpropagation-compatible...

02/19/2023 · Imprecise Bayesian Neural Networks
Uncertainty quantification and robustness to distribution shifts are imp...

07/27/2021 · Sparse Bayesian Deep Learning for Dynamic System Identification
This paper proposes a sparse Bayesian treatment of deep neural networks ...
