Marginally-calibrated deep distributional regression
Deep neural network (DNN) regression models are widely used in applications requiring state-of-the-art predictive accuracy. However, until recently there has been little work on accurate uncertainty quantification for predictions from such models. We add to this literature by outlining an approach to constructing predictive distributions that are 'marginally calibrated'. This means that the long-run average of the predictive distributions of the response variable matches the observed empirical margin. Our approach considers a DNN regression with a conditionally Gaussian prior for the final-layer weights, from which an implicit copula process on the feature space is extracted. This copula process is combined with a non-parametrically estimated marginal distribution for the response. The end result is a scalable distributional DNN regression method with marginally calibrated predictions, and our work complements existing methods for probability calibration. The approach is first illustrated using two applications of dense-layer feed-forward neural networks. However, our main motivating applications are in likelihood-free inference, where distributional deep regression is used to estimate marginal posterior distributions. In two complex ecological time-series examples we employ the implicit copulas of convolutional networks, and show that marginal calibration results in improved uncertainty quantification. Our approach also avoids the need for manual specification of summary statistics, a requirement that is burdensome for users and typical of competing likelihood-free inference methods.
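The copula construction described in the abstract can be sketched in a few lines. What follows is a minimal illustration only, not the authors' implementation: a hypothetical fixed random feature map stands in for the trained DNN's last hidden layer psi(x), the raw empirical CDF stands in for the paper's non-parametric marginal estimator, and a ridge-style plug-in fit replaces the full posterior under the conditionally Gaussian prior. All identifiers (psi, lam, predict_quantiles, and so on) are illustrative, not taken from the paper.

# Sketch of marginally calibrated distributional regression via a
# Gaussian-copula construction. Assumptions: fixed random features
# replace a trained DNN's last hidden layer; the empirical CDF
# replaces the paper's non-parametric marginal estimator.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Toy data with skewed noise, so a plain Gaussian likelihood would
# be marginally miscalibrated.
n = 2000
x = rng.uniform(-2, 2, size=(n, 1))
y = np.sin(2 * x[:, 0]) + rng.gamma(shape=1.5, scale=0.4, size=n)

# Stand-in for the network's last hidden layer psi(x) (hypothetical).
W = rng.normal(size=(1, 64))
b = rng.uniform(0, 2 * np.pi, size=64)
def psi(x):
    return np.tanh(x @ W + b)

# Step 1: map responses to normal scores through the empirical margin.
# Smoothed ranks keep the scores away from +/- infinity.
def ecdf(y_train, y_eval):
    return np.searchsorted(np.sort(y_train), y_eval, side="right") / (len(y_train) + 1)

z = norm.ppf(ecdf(y, y))

# Step 2: linear regression of z on psi(x) with a Gaussian (ridge)
# prior on the output weights; sigma2 is a plug-in noise variance.
Phi = psi(x)
lam = 1.0                                   # prior precision (assumed)
A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
beta = np.linalg.solve(A, Phi.T @ z)
resid = z - Phi @ beta
sigma2 = resid @ resid / n

# Step 3: predictive quantiles, pushed back through the Gaussian
# copula and then through the empirical marginal of y.
def predict_quantiles(x_new, alphas):
    phi_new = psi(np.atleast_2d(x_new))
    mu = phi_new @ beta
    # Predictive variance of the normal score under the ridge prior.
    var = sigma2 * (1 + np.sum(phi_new * np.linalg.solve(A, phi_new.T).T, axis=1))
    y_sorted = np.sort(y)
    out = {}
    for a in alphas:
        z_q = mu + np.sqrt(var) * norm.ppf(a)   # Gaussian-score quantile
        u = norm.cdf(z_q)                       # back through the copula
        idx = np.clip((u * (n + 1)).astype(int) - 1, 0, n - 1)
        out[a] = y_sorted[idx]                  # back through the margin
    return out

q = predict_quantiles(np.array([[0.5]]), [0.1, 0.5, 0.9])
print({a: v.item() for a, v in q.items()})

Because every predictive quantile is mapped back through the marginal distribution of the response, the long-run average of the predictive distributions matches the observed margin by construction, which is the marginal-calibration property the abstract describes.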