Multifidelity probability estimation via fusion of estimators

05/07/2019
by Boris Kramer, et al.

This paper develops a multifidelity method that enables estimation of failure probabilities for expensive-to-evaluate models via information fusion and importance sampling. The presented general fusion method combines multiple probability estimators with the goal of variance reduction. We use low-fidelity models to derive biasing densities for importance sampling and then fuse the importance sampling estimators such that the fused multifidelity estimator is unbiased and has mean-squared error lower than or equal to that of any of the importance sampling estimators alone. By fusing all available estimators, the method circumvents the challenging problem of selecting the best biasing density and using only that density for sampling. A rigorous analysis shows that the fused estimator is optimal in the sense that it has minimal variance amongst all possible combinations of the estimators. The asymptotic behavior of the proposed method is demonstrated on a convection-diffusion-reaction partial differential equation model for which 10^5 samples can be afforded. To illustrate the proposed method at scale, we consider a model of a free plane jet and quantify how uncertainties at the flow inlet propagate to a quantity of interest related to turbulent mixing. Compared to an importance sampling estimator that uses the high-fidelity model alone, our multifidelity estimator reduces the required CPU time by 65% while achieving a similar coefficient of variation.
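
The fusion step can be illustrated with a small numerical sketch. The Python snippet below is an illustrative assumption, not the authors' implementation: it forms two importance-sampling estimators of a tail probability from different biasing densities and combines them with inverse-variance weights, which is the minimum-variance unbiased convex combination when the estimators are independent. The limit-state function, biasing densities, and sample sizes are placeholders; in the paper's setting the biasing densities would be derived from low-fidelity models.

```python
# Minimal sketch: fuse several unbiased importance-sampling estimators of a
# small failure probability by inverse-variance weighting (assumes the
# estimators are independent; the problem setup below is illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# "High-fidelity" limit state: failure when x exceeds a threshold.
threshold = 3.0
def indicator_failure(x):
    return (x > threshold).astype(float)

nominal = stats.norm(0.0, 1.0)  # nominal input density p(x)

# Biasing densities, e.g. built from low-fidelity surrogates (here: shifted normals).
biasing = [stats.norm(2.5, 1.0), stats.norm(3.5, 1.2)]
n_samples = [2000, 2000]

estimates, variances = [], []
for q, n in zip(biasing, n_samples):
    x = q.rvs(size=n, random_state=rng)
    # Importance-sampling weight p(x)/q(x) applied to the failure indicator.
    w = nominal.pdf(x) / q.pdf(x)
    vals = indicator_failure(x) * w
    estimates.append(vals.mean())
    variances.append(vals.var(ddof=1) / n)  # estimated variance of the sample mean

estimates, variances = np.array(estimates), np.array(variances)

# Inverse-variance weights give an unbiased fused estimate whose variance is
# no larger than that of any single estimator (under independence).
alpha = (1.0 / variances) / np.sum(1.0 / variances)
fused = np.sum(alpha * estimates)
fused_var = 1.0 / np.sum(1.0 / variances)

print("individual estimates:", estimates)
print("fused estimate:      ", fused, "+/-", np.sqrt(fused_var))
print("reference P(X > 3):  ", 1.0 - nominal.cdf(threshold))
```

The paper's general fusion method additionally accounts for correlations among the estimators and weighs in the cost of each model, but the inverse-variance combination above conveys the basic variance-reduction mechanism.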
