Information Decomposition Diagrams Applied beyond Shannon Entropy: A Generalization of Hu's Theorem

02/18/2022
by Leon Lang, et al.

In information theory, one major goal is to find useful functions that summarize the amount of information contained in the interaction of several random variables. Specifically, one can ask how the classical Shannon entropy, mutual information, and higher interaction information functions relate to each other. This is formally answered by Hu's theorem, which is widely known in the form of information diagrams: it relates disjoint unions of shapes in a Venn diagram to summation rules of information functions, thereby establishing a bridge from set theory to information theory. While a proof of this theorem is known, it has to date not been analyzed in detail in what generality it can be established. In this work, we view random variables together with the joint operation as a monoid that acts on information functions by conditioning, and entropy as the unique function satisfying the chain rule of information. This allows us to abstract away from Shannon's theory and to prove a generalization of Hu's theorem that applies to the Shannon entropy of countably infinite discrete random variables, Kolmogorov complexity, Tsallis entropy, (Tsallis) Kullback-Leibler divergence, cross-entropy, submodular information functions, and the generalization error in machine learning. For Chaitin's prefix-free Kolmogorov complexity, our result implies that the higher-order interaction complexities of all degrees are, in expectation, close to Shannon interaction information. For well-behaved probability distributions on increasing sequence lengths, this means that the per-bit expected interaction complexity and interaction information coincide asymptotically, establishing a strong bridge between algorithmic and classical information theory.
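
To make the set-theoretic reading concrete, here is a minimal Python sketch of the classical Shannon-theoretic instances of these summation rules (standard textbook identities, not code from the paper; the XOR joint distribution and the function name H are illustrative choices). It computes mutual information as the overlap of two entropy regions and three-way interaction information as the central region of a three-set Venn diagram, via inclusion-exclusion over entropies:

```python
import math

def H(joint, axes):
    """Shannon entropy (in bits) of the marginal of `joint` on the given axes."""
    marginal = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in axes)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * math.log2(p) for p in marginal.values() if p > 0)

# Illustrative joint distribution: X, Y independent fair bits, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x in (0, 1) for y in (0, 1)}

# Overlap of two entropy regions: I(X;Y) = H(X) + H(Y) - H(X,Y).
I_xy = H(joint, (0,)) + H(joint, (1,)) - H(joint, (0, 1))

# Central region of the three-set diagram, by inclusion-exclusion:
# I(X;Y;Z) = H(X)+H(Y)+H(Z) - H(X,Y)-H(X,Z)-H(Y,Z) + H(X,Y,Z).
I_xyz = (H(joint, (0,)) + H(joint, (1,)) + H(joint, (2,))
         - H(joint, (0, 1)) - H(joint, (0, 2)) - H(joint, (1, 2))
         + H(joint, (0, 1, 2)))

print(I_xy)   # 0.0  (X and Y are independent)
print(I_xyz)  # -1.0 (negative central region)
```

The XOR triple evaluates the central region to -1 bit, the classic example showing that the regions of an information diagram form a signed measure rather than an ordinary one.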


