Density Regression and Uncertainty Quantification with Bayesian Deep Noise Neural Networks

06/12/2022
by Daiwei Zhang, et al.

Deep neural network (DNN) models have achieved state-of-the-art predictive accuracy in a wide range of supervised learning applications. However, accurately quantifying the uncertainty in DNN predictions remains a challenging task. For continuous outcome variables, an even more difficult problem is to estimate the predictive density function, which not only provides a natural quantification of the predictive uncertainty, but also fully captures the random variation in the outcome. In this work, we propose the Bayesian Deep Noise Neural Network (B-DeepNoise), which generalizes standard Bayesian DNNs by extending the random noise variable from the output layer to all hidden layers. The latent random noise equips B-DeepNoise with the flexibility to approximate highly complex predictive distributions and accurately quantify predictive uncertainty. For posterior computation, the unique structure of B-DeepNoise leads to a closed-form Gibbs sampling algorithm that iteratively simulates from the posterior full conditional distributions of the model parameters, circumventing computationally intensive Metropolis-Hastings methods. A theoretical analysis of B-DeepNoise establishes a recursive representation of the predictive distribution and decomposes the predictive variance with respect to the latent parameters. We evaluate B-DeepNoise against existing methods on benchmark regression datasets, demonstrating its superior performance in terms of prediction accuracy, uncertainty quantification accuracy, and uncertainty quantification efficiency. To illustrate our method's usefulness in scientific studies, we apply B-DeepNoise to predict general intelligence from neuroimaging features in the Adolescent Brain Cognitive Development (ABCD) project.
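To make the structural idea concrete: the model injects additive Gaussian noise after every hidden layer rather than only at the output, so repeated stochastic forward passes trace out a flexible predictive density. The sketch below is a minimal, hypothetical illustration of that noise-propagation mechanism (layer sizes, activation, and noise scales are assumptions for illustration only; it does not implement the paper's priors or the closed-form Gibbs sampler, in which the weights and noise variances would themselves be sampled from their posterior).

```python
import numpy as np

rng = np.random.default_rng(0)

def deep_noise_forward(x, weights, biases, noise_scales, rng):
    """One stochastic forward pass: Gaussian noise is added after every
    hidden layer (not just at the output), as in a deep-noise network."""
    h = x
    for layer, (W, b, s) in enumerate(zip(weights, biases, noise_scales)):
        h = h @ W + b
        if layer < len(weights) - 1:
            h = np.tanh(h)                               # hidden-layer nonlinearity
            h = h + rng.normal(scale=s, size=h.shape)    # latent hidden-layer noise
        else:
            h = h + rng.normal(scale=s, size=h.shape)    # output (observation) noise
    return h

# Hypothetical 2-hidden-layer network; sizes and scales chosen for illustration.
sizes = [5, 16, 16, 1]
weights = [rng.normal(scale=0.5, size=(m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros(n) for n in sizes[1:]]
noise_scales = [0.3, 0.3, 0.1]

# Monte Carlo approximation of the predictive distribution at a test input:
# repeated stochastic passes yield draws from a possibly skewed or multimodal density.
x_test = rng.normal(size=(1, 5))
draws = np.concatenate([deep_noise_forward(x_test, weights, biases, noise_scales, rng)
                        for _ in range(2000)])
print("predictive mean:", draws.mean(), "predictive sd:", draws.std())
```

Because the noise enters at every layer and passes through subsequent nonlinearities, the resulting predictive distribution need not be Gaussian, which is what gives the model its flexibility for density regression.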

