Easy Uncertainty Quantification (EasyUQ): Generating predictive distributions from single-valued model output

12/16/2022
by Eva-Maria Walz, et al.

How can we quantify uncertainty if our favorite computational tool (be it a numerical, statistical, or machine learning approach, or any other computer model) provides single-valued output only? In this article, we introduce the Easy Uncertainty Quantification (EasyUQ) technique, which transforms real-valued model output into calibrated statistical distributions, based solely on training data of model output-outcome pairs, without any need to access model input. In its basic form, EasyUQ is a special case of the recently introduced Isotonic Distributional Regression (IDR) technique that leverages the pool-adjacent-violators algorithm for nonparametric isotonic regression. EasyUQ yields discrete predictive distributions that are calibrated and optimal in finite samples, subject to stochastic monotonicity. The workflow is fully automated, without any need for tuning. The Smooth EasyUQ approach supplements IDR with kernel smoothing, to yield continuous predictive distributions that preserve key properties of the basic form, including both stochastic monotonicity with respect to the original model output and asymptotic consistency. For the selection of kernel parameters, we introduce multiple one-fit grid search, a computationally much less demanding approximation to leave-one-out cross-validation. We use simulation examples and the WeatherBench challenge in data-driven weather prediction to illustrate the techniques. In a study of benchmark problems from machine learning, we show how EasyUQ and Smooth EasyUQ can be integrated into the workflow of modern neural network learning and hyperparameter tuning, and find EasyUQ to be competitive with more elaborate input-based approaches.
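To make the basic idea concrete, here is a minimal sketch of the IDR construction underlying EasyUQ, not the authors' implementation. At each threshold z, the predictive CDF value F_x(z) is the isotonic (here: non-increasing in the model output x) regression of the binary indicators 1{y_i <= z}, which scikit-learn's `IsotonicRegression` solves via the pool-adjacent-violators algorithm. The function name and interface below are illustrative.

```python
import numpy as np
from sklearn.isotonic import IsotonicRegression


def easyuq_cdf(x_train, y_train, x_new, thresholds):
    """Sketch of basic EasyUQ/IDR: predictive CDF values F_{x}(z).

    For each threshold z, fit a pool-adjacent-violators regression of the
    indicators 1{y_i <= z} on the model output x_i, constrained to be
    non-increasing in x so that larger model output corresponds to a
    stochastically larger predictive distribution.
    """
    order = np.argsort(x_train)
    x_sorted, y_sorted = x_train[order], y_train[order]
    cdf = np.empty((len(x_new), len(thresholds)))
    for j, z in enumerate(thresholds):
        indicators = (y_sorted <= z).astype(float)
        iso = IsotonicRegression(increasing=False, out_of_bounds="clip")
        iso.fit(x_sorted, indicators)
        cdf[:, j] = iso.predict(x_new)
    return cdf
```

Because isotonic regression is an order-preserving operator and the indicators are monotone in z, the fitted values are automatically non-decreasing across thresholds, so the rows of `cdf` are valid discrete CDF evaluations; Smooth EasyUQ would additionally apply kernel smoothing to these discrete distributions.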


Related research

- Density Regression and Uncertainty Quantification with Bayesian Deep Noise Neural Networks (06/12/2022)
- Sample-based Uncertainty Quantification with a Single Deterministic Neural Network (09/17/2022)
- Universal Prediction Band via Semi-Definite Programming (03/31/2021)
- Isotonic Distributional Regression (09/09/2019)
- Calibration tests beyond classification (10/21/2022)
- Regression-based sparse polynomial chaos for uncertainty quantification of subsurface flow models (09/04/2019)
- Data-driven polynomial chaos expansion for machine learning regression (08/09/2018)
