Efficient estimation of divergence-based sensitivity indices with Gaussian process surrogates

04/08/2019
by A. W. Eggels, et al.

We consider the estimation of sensitivity indices based on divergence measures such as the Kullback-Leibler divergence. For sensitivity analysis of complex models, these divergence-based indices can be estimated by Monte Carlo sampling (MCS) in combination with kernel density estimation (KDE). In a direct approach, the complex model must be evaluated at every input point generated by MCS, resulting in samples in the input-output space that can be used for density estimation. However, if the computational cost of the complex model strongly limits the number of model evaluations, this direct method yields large errors. A recent method uses polynomial dimensional decomposition (PDD), which assumes the input variables are independent. To avoid the assumption of independent inputs, we propose to use Gaussian process (GP) surrogates to increase the number of samples in the combined input-output space. By enlarging this sample set, the KDE becomes more accurate, leading to improved estimates. We investigate two estimators: one that uses only the GP mean, and one that also accounts for the GP prediction variance. We assess the performance of both estimators and demonstrate that they outperform the PDD-based method. We find that the estimator based on the GP mean performs best.
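The GP-mean estimator described in the abstract can be sketched in a few steps: train a GP surrogate on a small number of expensive model evaluations, draw a large Monte Carlo sample of (possibly dependent) inputs, predict the corresponding outputs with the GP mean, and estimate the divergence-based index with KDE on the enlarged input-output sample. The following is a minimal sketch, not the authors' implementation: it assumes scikit-learn and scipy, and the names `kl_sensitivity_gp_mean`, `model`, and `sample_inputs` are hypothetical. The index computed here is the mutual-information form S_i = E_{X_i}[ KL( p(Y | X_i) || p(Y) ) ], one common divergence-based choice.

```python
# Hypothetical sketch of a GP-mean, KDE-based sensitivity estimator.
# Assumes scikit-learn and scipy; names and defaults are illustrative only.
import numpy as np
from scipy.stats import gaussian_kde
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel


def kl_sensitivity_gp_mean(model, sample_inputs, n_train=50, n_mc=5000, rng=None):
    """Estimate a KL-divergence-based sensitivity index for each input.

    model         -- expensive simulator mapping (n, d) inputs to (n,) outputs
    sample_inputs -- callable returning (n, d) input samples; may encode
                     dependent inputs (e.g. drawn from a copula or data set)
    """
    rng = np.random.default_rng(rng)

    # 1) A small number of expensive model evaluations to train the GP surrogate.
    X_train = sample_inputs(n_train)
    y_train = model(X_train)
    gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True)
    gp.fit(X_train, y_train)

    # 2) A large Monte Carlo sample of inputs; outputs from the GP mean only.
    X_mc = sample_inputs(n_mc)
    y_mc = gp.predict(X_mc)

    # 3) For each input X_i, estimate
    #    S_i = E_{X_i}[ KL( p(Y | X_i) || p(Y) ) ]
    #        = integral of p(x_i, y) * log( p(x_i, y) / (p(x_i) p(y)) )
    #    via KDE, averaged over the Monte Carlo sample of the joint.
    d = X_mc.shape[1]
    indices = np.empty(d)
    p_y = gaussian_kde(y_mc)(y_mc)                      # marginal density of Y
    for i in range(d):
        joint = gaussian_kde(np.vstack([X_mc[:, i], y_mc]))
        p_xy = joint(np.vstack([X_mc[:, i], y_mc]))     # joint density of (X_i, Y)
        p_x = gaussian_kde(X_mc[:, i])(X_mc[:, i])      # marginal density of X_i
        indices[i] = np.mean(np.log(p_xy / (p_x * p_y)))
    return indices
```

In this sketch the dependence structure of the inputs is handled entirely by `sample_inputs`, so no independence assumption is needed; the second estimator mentioned in the abstract would additionally draw output samples from the GP predictive distribution instead of using only `gp.predict`.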


