An axiomatic characterization of mutual information

08/28/2021
by James Fullwood, et al.

We characterize mutual information as the unique map on ordered pairs of random variables satisfying a set of axioms similar to those of Faddeev's characterization of the Shannon entropy. Our characterization, however, contains one new axiom with no Shannon-entropy analogue: it is based on the notion of a Markov triangle, which may be thought of as a composition of communication channels for which conditional entropy acts functorially. Our proofs are coordinate-free in the sense that no logarithms appear in our calculations.
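For orientation, here is the standard logarithmic presentation that the paper's coordinate-free axioms are meant to replace; this is background context, not the paper's own construction. Mutual information is the entropy deficit of a joint distribution,

\[ I(X;Y) = H(X) + H(Y) - H(X,Y), \qquad H(p_1,\dots,p_n) = -\sum_{i=1}^{n} p_i \log p_i, \]

and Faddeev's theorem singles out $H$, up to a positive scalar fixed by the normalization $H(\tfrac{1}{2},\tfrac{1}{2}) = 1$, as the only continuous, symmetric function of finite probability distributions satisfying the recursion

\[ H(p_1,\dots,p_{n-1},q_1,q_2) = H(p_1,\dots,p_{n-1},p_n) + p_n\, H\!\left(\tfrac{q_1}{p_n},\tfrac{q_2}{p_n}\right), \qquad p_n = q_1 + q_2 > 0. \]

The paper's result replaces this recursion for $H$ with an analogous axiom list for the two-variable map $(X,Y) \mapsto I(X;Y)$, together with the Markov-triangle axiom described above.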


research · 06/15/2019
Non parametric estimation of Joint entropy and Shannon mutual information, Asymptotic limits: Application to statistic tests
This paper proposes a new method for estimating the joint probability ma...

research · 07/05/2021
The information loss of a stochastic map
We provide a stochastic extension of the Baez-Fritz-Leinster characteriz...

research · 12/22/2022
Markov Categories and Entropy
Markov categories are a novel framework to describe and treat problems i...

research · 01/29/2021
An Information Bottleneck Problem with Rényi's Entropy
This paper considers an information bottleneck problem with the objectiv...

research · 02/18/2022
Information Decomposition Diagrams Applied beyond Shannon Entropy: A Generalization of Hu's Theorem
In information theory, one major goal is to find useful functions that s...

research · 08/10/2023
A Characterization of Entropy as a Universal Monoidal Natural Transformation
We show that the essential properties of entropy (monotonicity, additivi...

research · 09/08/2018
Hybrid Statistical Estimation of Mutual Information and its Application to Information Flow
Analysis of a probabilistic system often requires to learn the joint pro...
