Hybrid Statistical Estimation of Mutual Information and its Application to Information Flow

09/08/2018
by Fabrizio Biondi, et al.

Analysis of a probabilistic system often requires learning the joint probability distribution of its random variables. Computing the exact distribution usually demands an exhaustive, precise analysis of all executions of the system. To avoid the high computational cost of such an exhaustive search, statistical analysis has been studied as a way to efficiently obtain approximate estimates by analyzing only a small but representative subset of the system's behavior. In this paper we propose a hybrid statistical estimation method that combines precise and statistical analyses to estimate mutual information, Shannon entropy, and conditional entropy, together with their confidence intervals. We show how to combine analyses of different components of a discrete system, each performed with a different accuracy, into an estimate for the whole system. The new method performs weighted statistical analysis with different sample sizes over different components and dynamically finds their optimal sample sizes. Moreover, it can reduce sample sizes by using prior knowledge about systems and by a new abstraction-then-sampling technique based on qualitative analysis. To apply the method to the source code of a system, we show how to decompose the code into components and how to determine the analysis method for each component, giving an overview of the implementation of these techniques in the HyLeak tool. We demonstrate with case studies that the new method outperforms the state of the art in quantifying information leakage.
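To illustrate the core idea of the hybrid estimation described in the abstract, the following is a minimal Python sketch: a system's joint distribution over (secret, output) pairs is split into weighted components, some analyzed exactly and some estimated from samples, and the pieces are recombined before computing mutual information. The component definitions, weights, and function names are hypothetical illustrations for this sketch, not the HyLeak implementation, and the bias corrections and confidence intervals derived in the paper are omitted.

```python
# A minimal sketch of hybrid mutual-information estimation over a toy
# system decomposed into weighted components. Not the HyLeak API.
import math
import random
from collections import Counter

def mutual_information(joint):
    """I(X;Y) in bits from a dict {(x, y): p} whose values sum to 1."""
    px, py = Counter(), Counter()
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * math.log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

def empirical_joint(sampler, n):
    """Estimate a component's joint sub-distribution from n samples."""
    counts = Counter(sampler() for _ in range(n))
    return {xy: c / n for xy, c in counts.items()}

def hybrid_estimate(precise, sampled, n_per_component):
    """Combine exact and sampled component distributions.

    precise: list of (weight, exact joint dict) pairs
    sampled: list of (weight, sampler function) pairs
    Weights are the probabilities of entering each component.
    """
    combined = Counter()
    for w, joint in precise:              # exact analysis of components
        for xy, p in joint.items():
            combined[xy] += w * p
    for w, sampler in sampled:            # statistical analysis of the rest
        for xy, p in empirical_joint(sampler, n_per_component).items():
            combined[xy] += w * p
    return mutual_information(combined)

# Toy example: component A is small enough to analyze exactly (it leaks
# the secret deterministically); component B is only sampled (no leak).
exact_a = {(0, 0): 0.5, (1, 1): 0.5}
sampler_b = lambda: (random.randint(0, 1), random.randint(0, 1))
print(hybrid_estimate(precise=[(0.5, exact_a)],
                      sampled=[(0.5, sampler_b)],
                      n_per_component=10000))
```

In this sketch every sampled component gets the same sample size; the paper's method instead chooses per-component sample sizes adaptively so that the components contributing most variance to the estimate receive the most samples.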
