The Design of Mutual Information

07/10/2019
by Nicholas Carrara, et al.

We derive the functional form of mutual information (MI) from a set of design criteria and a principle of maximal sufficiency. The MI between two sets of propositions is a global quantifier of correlations and is implemented as a tool for ranking joint probability distributions with respect to those correlations. The derivation parallels the derivation of relative entropy, with an emphasis on the behavior of independent variables. By constraining the functional I according to special cases, we arrive at its general functional form and thereby establish a clear meaning behind its definition. We also discuss the notion of sufficiency and offer a new definition which broadens its applicability.
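For context, the functional form that such a design derivation singles out is the standard mutual information of a joint distribution p(x, y) with marginals p(x) and p(y). The expression below is the textbook definition, stated here only as a reference point; the notation I[P] for the functional is an assumption for illustration, not a quotation from the paper:

I[P] = \sum_{x, y} p(x, y) \, \log \frac{p(x, y)}{p(x)\, p(y)}

with the sum replaced by an integral for continuous variables. In particular, I[P] = 0 exactly when the variables are independent, i.e. when p(x, y) = p(x) p(y), which is the behavior of independent variables emphasized in the derivation.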


