Quantifying multivariate redundancy with maximum entropy decompositions of mutual information

08/13/2017
by Daniel Chicharro et al.

Williams and Beer (2010) proposed a nonnegative mutual information decomposition, based on the construction of redundancy lattices, which separates the information that a set of variables contains about a target variable into nonnegative components, interpretable as the unique information of some variables not contained in others as well as redundant and synergistic components. However, defining multivariate measures of redundancy that are nonnegative and conform to certain axioms capturing conceptually desirable properties of redundancy has proven elusive. Here we present a procedure to determine multivariate redundancy measures within the framework of maximum entropy models. In particular, we generalize existing bivariate maximum entropy-based measures of redundancy and unique information, defining measures of the redundant information that a group of variables has about a target, and of the unique redundant information that a group of variables has about a target and that is not redundant with the information from another group. The approach rests on two key ingredients. First, the identification of a type of constraint on entropy maximization that allows isolating components of redundancy and unique redundancy by mirroring them to synergy components. Second, the construction of rooted-tree decompositions that break down the mutual information, ensuring nonnegativity through the local application of maximum entropy information projections at each binary unfolding of the tree nodes. Altogether, the proposed measures are nonnegative and conform to the desirable axioms for redundancy measures.
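
For reference, in the bivariate case that these measures generalize, the Williams and Beer decomposition splits the joint mutual information of two sources X1 and X2 about a target Y into a redundant part R, unique parts U1 and U2, and a synergistic part S, related to the classical mutual information terms by

I(X1,X2;Y) = R + U1 + U2 + S,    I(X1;Y) = R + U1,    I(X2;Y) = R + U2.

The sketch below illustrates, for discrete variables, the kind of maximum entropy information projection the abstract refers to. Among all joint distributions that preserve the (X1,Y) and (X2,Y) marginals of p, the entropy maximizer is the conditional independence model q(x1,x2,y) = p(y) p(x1|y) p(x2|y), and comparing the mutual information before and after this projection isolates a synergy-like component. This is only a minimal illustration of the general mechanism, with hypothetical helper names; the paper develops specific constraint sets and a rooted-tree construction that this toy example does not reproduce.

import numpy as np

def mutual_info(pxy):
    """I(X;Y) in bits for a joint distribution given as a 2-D array p[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)         # p(x)
    py = pxy.sum(axis=0, keepdims=True)         # p(y)
    prod = px @ py                              # p(x) p(y)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / prod[nz])))

def maxent_projection(p):
    """Maximum entropy joint q(x1, x2, y) preserving the (X1, Y) and (X2, Y)
    marginals of p; it factorizes as p(y) p(x1|y) p(x2|y)."""
    py = p.sum(axis=(0, 1))                     # p(y)
    px1y = p.sum(axis=1)                        # p(x1, y)
    px2y = p.sum(axis=0)                        # p(x2, y)
    q = np.zeros_like(p)
    for y in range(p.shape[2]):
        if py[y] > 0:
            q[:, :, y] = np.outer(px1y[:, y], px2y[:, y]) / py[y]
    return q

def joint_source_mi(p):
    """I(X1, X2; Y), treating the pair (X1, X2) as a single source."""
    return mutual_info(p.reshape(-1, p.shape[2]))

# Example: Y = X1 XOR X2 with independent, uniform binary inputs.
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25

q = maxent_projection(p)
print(joint_source_mi(p))                       # 1.0 bit: the pair determines Y
print(joint_source_mi(q))                       # 0.0 bits: the projection removes it
print(joint_source_mi(p) - joint_source_mi(q))  # synergy-like component (1.0 bit)

In the XOR example neither source alone carries information about the target, so the full 1 bit of joint information vanishes once the maximum entropy projection discards everything beyond the pairwise (Xi, Y) marginals.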
