The information loss of a stochastic map

07/05/2021
by James Fullwood, et al.

We provide a stochastic extension of the Baez-Fritz-Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call 'conditional information loss.' Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an 'entropic Bayes' rule' for information measures, and we provide a characterization of conditional entropy in terms of this rule.
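
For readers unfamiliar with the Baez-Fritz-Leinster result: for a measure-preserving function f: (X, p) → (Y, q) between finite probability spaces, the information loss is the entropy difference H(p) − H(q), and, as the abstract notes, the stochastic extension recovers the conditional entropy. The sketch below is a minimal numerical illustration of those standard formulas (plain NumPy, entropies in bits; the helper names are ours and purely illustrative), not the paper's categorical construction.

```python
# A minimal sketch of the standard formulas, assuming NumPy.
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) in bits, ignoring zero-probability outcomes."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return float(-np.sum(nz * np.log2(nz)))

def pushforward(p, f, n_out):
    """Pushforward q = f_*(p) of p along a deterministic map f on indices."""
    q = np.zeros(n_out)
    for x, px in enumerate(p):
        q[f(x)] += px
    return q

def conditional_entropy(p, channel):
    """H(X|Y) in bits, for input distribution p on X and a row-stochastic
    channel[x, y] = Pr(Y = y | X = x)."""
    p = np.asarray(p, dtype=float)
    joint = p[:, None] * np.asarray(channel, dtype=float)  # Pr(X = x, Y = y)
    q = joint.sum(axis=0)                                    # marginal on Y
    return shannon_entropy(joint.ravel()) - shannon_entropy(q)

# Deterministic example: f collapses outcomes 0 and 1 of X into output 0.
p = np.array([0.25, 0.25, 0.5])
f = lambda x: 0 if x < 2 else 1
q = pushforward(p, f, 2)
loss = shannon_entropy(p) - shannon_entropy(q)   # BFL information loss H(p) - H(q)

# The same map written as a (deterministic) stochastic matrix.
channel = np.array([[1.0, 0.0],
                    [1.0, 0.0],
                    [0.0, 1.0]])
print(loss, conditional_entropy(p, channel))      # both print 0.5
```

For a deterministic measure-preserving map the two numbers coincide, since H(X, Y) = H(X) when Y is a function of X; this is one way to see why conditional entropy is a natural candidate for the information loss of a general stochastic map.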

Related research:

- 08/28/2021: An axiomatic characterization of mutual information. "We characterize mutual information as the unique map on ordered pairs of..."
- 08/10/2023: A Characterization of Entropy as a Universal Monoidal Natural Transformation. "We show that the essential properties of entropy (monotonicity, additivi..."
- 03/31/2017: A New Measure of Conditional Dependence. "Measuring conditional dependencies among the variables of a network is o..."
- 06/01/2023: On the entropy of rectifiable and stratified measures. "We summarize some results of geometric measure theory concerning rectifi..."
- 04/16/2019: One-adhesive polymatroids. "Adhesive polymatroids were defined by F. Matúš motivated by entropy func..."
- 07/07/2021: Information-theoretic characterization of the complete genotype-phenotype map of a complex pre-biotic world. "How information is encoded in bio-molecular sequences is difficult to qu..."
- 05/10/2023: Supervised learning with probabilistic morphisms and kernel mean embeddings. "In this paper I propose a concept of a correct loss function in a genera..."
