A partial information decomposition for discrete and continuous variables

06/23/2021
by Kyle Schick-Poland, et al.

Conceptually, partial information decomposition (PID) is concerned with separating the information that several source variables hold about a target variable by decomposing the corresponding joint mutual information into synergistic, redundant, and unique contributions. Although PID is conceptually defined for any type of random variable, it has so far only been quantified for the joint mutual information of discrete systems. Recently, a quantification of PID in continuous settings with two or three source variables was introduced. However, no approach has yet managed to both quantify PID for more than three source variables and cover general measure-theoretic random variables, such as mixed discrete-continuous or continuous random variables. In this work we propose an information quantity, defining the terms of a PID, that is well-defined for any number and type of source and target random variables. The proposed quantity is closely related to a recently developed local shared-information measure for discrete random variables based on the idea of shared exclusions. Furthermore, we prove that the new measure satisfies several desirable properties: a set of local PID axioms, invariance under invertible transformations, differentiability with respect to the underlying probability density, and a target chain rule.
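To make concrete what such a decomposition has to account for, consider the textbook XOR example: with two independent, uniformly distributed binary sources and the target T = S1 XOR S2, each source alone carries zero mutual information about the target, yet together they determine it completely, so the joint mutual information of one bit is purely synergistic. The minimal Python sketch below only computes these mutual-information terms for the XOR gate; it illustrates the quantities a PID decomposes and is not an implementation of the shared-exclusion measure proposed in the paper. All function and variable names are illustrative.

```python
# Illustrative only: computes the mutual-information terms a PID must account for;
# it does NOT implement the measure proposed in the paper.
from itertools import product
from collections import defaultdict
from math import log2

def mutual_information(joint):
    """I(X;Y) in bits for a joint distribution given as {(x, y): p}."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), p in joint.items():
        px[x] += p
        py[y] += p
    return sum(p * log2(p / (px[x] * py[y]))
               for (x, y), p in joint.items() if p > 0)

# XOR gate: independent, uniform binary sources S1, S2 and target T = S1 ^ S2.
p_s1_s2_t = {(s1, s2, s1 ^ s2): 0.25 for s1, s2 in product((0, 1), repeat=2)}

# Marginalise onto the pairs needed for the three mutual-information terms.
joint_s1_t, joint_s2_t, joint_s12_t = defaultdict(float), defaultdict(float), defaultdict(float)
for (s1, s2, t), p in p_s1_s2_t.items():
    joint_s1_t[(s1, t)] += p
    joint_s2_t[(s2, t)] += p
    joint_s12_t[((s1, s2), t)] += p

print("I(T;S1)    =", mutual_information(joint_s1_t))   # 0.0 bits
print("I(T;S2)    =", mutual_information(joint_s2_t))   # 0.0 bits
print("I(T;S1,S2) =", mutual_information(joint_s12_t))  # 1.0 bit, purely synergistic
```

In the bivariate case, a PID then splits I(T;S1,S2) into unique, redundant, and synergistic parts subject to the standard consistency equations I(T;S1) = Unq(T;S1) + Red and I(T;S1,S2) = Unq(T;S1) + Unq(T;S2) + Red + Syn; the paper's contribution is a definition of these parts that remains well-defined for arbitrary numbers and types of source and target variables.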

Related research

12/27/2018
On mutual information estimation for mixed-pair random variables
We study the mutual information estimation for mixed-pair random variabl...

01/30/2021
Estimating the Unique Information of Continuous Variables
Partial information decompositions (PIDs) identify different modes in wh...

05/11/2023
Computing Unique Information for Poisson and Multinomial Systems
Bivariate Partial Information Decomposition (PID) describes how the mutu...

04/01/2021
Reconciling the Discrete-Continuous Divide: Towards a Mathematical Theory of Sparse Communication
Neural networks and other machine learning models compute continuous rep...

12/23/2021
Signed and Unsigned Partial Information Decompositions of Continuous Network Interactions
We investigate the partial information decomposition (PID) framework as ...

08/05/2021
Sparse Communication via Mixed Distributions
Neural networks and other machine learning models compute continuous rep...

05/19/2019
Minimal Achievable Sufficient Statistic Learning
We introduce Minimal Achievable Sufficient Statistic (MASS) Learning, a ...
