Marginally-calibrated deep distributional regression

08/26/2019
by Nadja Klein, et al.

Deep neural network (DNN) regression models are widely used in applications requiring state-of-the-art predictive accuracy. However, until recently there has been little work on accurate uncertainty quantification for predictions from such models. We add to this literature by outlining an approach to constructing predictive distributions that are 'marginally calibrated'. This is where the long-run average of the predictive distributions of the response variable matches the observed empirical margin. Our approach considers a DNN regression with a conditionally Gaussian prior for the final-layer weights, from which an implicit copula process on the feature space is extracted. This copula process is combined with a non-parametrically estimated marginal distribution for the response. The end result is a scalable distributional DNN regression method with marginally calibrated predictions, and our work complements existing methods for probability calibration. The approach is first illustrated using two applications of dense-layer feed-forward neural networks. However, our main motivating applications are in likelihood-free inference, where distributional deep regression is used to estimate marginal posterior distributions. In two complex ecological time-series examples we employ the implicit copulas of convolutional networks, and show that marginal calibration results in improved uncertainty quantification. Our approach also avoids the need for manual specification of summary statistics, a requirement that is burdensome for users and typical of competing likelihood-free inference methods.
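
To make the construction concrete, below is a minimal Python sketch of the general idea under simplifying assumptions: the response margin is estimated by its empirical distribution, the network's last hidden layer is replaced by a fixed feature map, and a conjugate ridge-style Gaussian fit stands in for the Bayesian last-layer posterior from which the paper extracts its implicit copula. All names and parameters here (make_features, predictive_quantile, lam) are illustrative and not from the paper.

```python
# Illustrative sketch only, not the authors' code: regress Gaussian
# pseudo-responses on "last layer" features, then map predictions back
# through the estimated response margin.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Toy data with a skewed, non-Gaussian response.
n, p = 1000, 3
X = rng.normal(size=(n, p))
y = np.exp(0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * rng.normal(size=n))

# Nonparametric margin of y: empirical CDF (kept away from 0/1) and quantiles.
y_sorted = np.sort(y)

def F_hat(v):
    return (np.searchsorted(y_sorted, v, side="right") + 0.5) / (n + 1)

def F_hat_inv(u):
    return np.quantile(y_sorted, u)

def make_features(X):
    # Stand-in for the output of a trained DNN's last hidden layer.
    return np.column_stack([np.ones(len(X)), np.tanh(X)])

# Gaussian pseudo-responses and a conjugate (ridge-like) Gaussian fit of the
# final-layer weights, giving a Gaussian predictive on the z scale.
z = stats.norm.ppf(F_hat(y))
Psi = make_features(X)
lam = 1.0                                   # assumed prior precision
A = Psi.T @ Psi + lam * np.eye(Psi.shape[1])
w_mean = np.linalg.solve(A, Psi.T @ z)
sigma2 = np.mean((z - Psi @ w_mean) ** 2)   # plug-in noise variance

def predictive_quantile(x_new, q):
    """q-quantile of Y | x_new: Gaussian on the z scale, empirical margin on y."""
    psi = make_features(x_new[None, :])[0]
    m = psi @ w_mean
    s = np.sqrt(sigma2 * (1.0 + psi @ np.linalg.solve(A, psi)))
    return F_hat_inv(stats.norm.cdf(m + s * stats.norm.ppf(q)))

# Example: 90% predictive interval and median for the first feature vector.
print([predictive_quantile(X[0], q) for q in (0.05, 0.5, 0.95)])
```

Because every predictive quantile is read back through the estimated margin of the response, the long-run average of these predictive distributions is tied to the empirical distribution of the data, which is the calibration property the abstract describes.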


Related research

06/12/2022 - Density Regression and Uncertainty Quantification with Bayesian Deep Noise Neural Networks
Deep neural network (DNN) models have achieved state-of-the-art predicti...

02/21/2023 - Improved uncertainty quantification for neural networks with Bayesian last layer
Uncertainty quantification is an essential task in machine learning - a ...

05/29/2022 - Calibrated Predictive Distributions via Diagnostics for Conditional Coverage
Uncertainty quantification is crucial for assessing the predictive abili...

06/18/2020 - Calibrated Reliable Regression using Maximum Mean Discrepancy
Accurate quantification of uncertainty is crucial for real-world applica...

10/03/2021 - Marginally calibrated response distributions for end-to-end learning in autonomous driving
End-to-end learners for autonomous driving are deep neural networks that...

12/02/2021 - RafterNet: Probabilistic predictions in multi-response regression
A fully nonparametric approach for making probabilistic predictions in m...

12/14/2022 - Uncertainty Quantification for Deep Neural Networks: An Empirical Comparison and Usage Guidelines
Deep Neural Networks (DNN) are increasingly used as components of larger...
