An information upper bound for probability sensitivity

06/05/2022
by Jiannan Yang et al.

Uncertain inputs of a mathematical model induce uncertainties in the output, and probabilistic sensitivity analysis identifies the influential inputs to guide decision-making. Of practical concern is the probability that the output would, or would not, exceed a threshold, and the probability sensitivity depends on this threshold, which is often itself uncertain. The Fisher information and the Kullback-Leibler divergence have recently been proposed in the literature as threshold-independent sensitivity metrics. We present a mathematical proof that these information-theoretic metrics provide an upper bound for the probability sensitivity. The proof is elementary, relying only on a special version of the Cauchy-Schwarz inequality known as Titu's lemma. Although various inequalities exist for probabilities, little is known about bounds on probability sensitivity, and the bound proposed here is, to the present authors' knowledge, new. The probability sensitivity bound is extended, analytically and with numerical examples, to the Fisher information of both the input and the output. It thus provides a solid mathematical basis for decision-making based on probabilistic sensitivity metrics.
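
As a hint of how Titu's lemma can yield a bound of this kind, the sketch below is illustrative only and not necessarily the exact statement proved in the paper; the threshold y_0, the output density p(y | theta), the exceedance probability P(theta) and the Fisher information F(theta) are assumed notation introduced here for the illustration.

% Titu's lemma (Cauchy-Schwarz in Engel form), for real a_i and positive b_i:
\[
\sum_i \frac{a_i^2}{b_i} \;\ge\; \frac{\left( \sum_i a_i \right)^2}{\sum_i b_i} .
\]
% Its integral analogue, applied to the exceedance probability
% P(\theta) = \int_{y > y_0} p(y \mid \theta)\, \mathrm{d}y, gives
\[
\left( \frac{\partial P}{\partial \theta} \right)^2
= \left( \int_{y > y_0} \frac{\partial p(y \mid \theta)}{\partial \theta}\, \mathrm{d}y \right)^2
\le \int_{y > y_0} \frac{\bigl( \partial p(y \mid \theta)/\partial \theta \bigr)^2}{p(y \mid \theta)}\, \mathrm{d}y
    \int_{y > y_0} p(y \mid \theta)\, \mathrm{d}y
\le F(\theta)\, P(\theta) \le F(\theta),
\]
% where F(\theta) is the Fisher information of the output density with respect to \theta:
\[
F(\theta) = \int \frac{\bigl( \partial p(y \mid \theta)/\partial \theta \bigr)^2}{p(y \mid \theta)}\, \mathrm{d}y .
\]

Because F(theta) does not depend on the threshold y_0, a bound of this form is threshold-independent, consistent with the role the abstract ascribes to the information-theoretic metrics; since P(theta) <= 1, the magnitude of the probability sensitivity is bounded by the square root of the Fisher information.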

Related research

10/03/2022  A general framework for probabilistic sensitivity analysis with respect to distribution parameters
Probabilistic sensitivity analysis identifies the influential uncertain ...

04/02/2021  Decision-theoretic reliability sensitivity
We propose and discuss sensitivity metrics for reliability analysis, whi...

07/04/2012  Exploiting Evidence-dependent Sensitivity Bounds
Studying the effects of one-way variation of any number of parameters on...

11/30/2017  Why So Many Published Sensitivity Analyses Are False. A Systematic Review of Sensitivity Analysis Practices
Sensitivity analysis (SA) has much to offer for a very large class of ap...

08/08/2022  Kernel-based Global Sensitivity Analysis Obtained from a Single Data Set
Results from global sensitivity analysis (GSA) often guide the understan...

02/22/2021  Probabilistic Learning on Manifolds (PLoM) with Partition
The probabilistic learning on manifolds (PLoM) introduced in 2016 has so...

03/08/2023  Bounding the Probabilities of Benefit and Harm Through Sensitivity Parameters and Proxies
We present two methods for bounding the probabilities of benefit and har...
