The maximum entropy of a metric space

08/29/2019
by Tom Leinster, et al.

We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These entropies generalise the Shannon and Rényi entropies of information theory. We prove that on any space X, there is a single probability measure maximising all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows; its asymptotics determine geometric information about X, including the volume and dimension. We also study the large-scale limit of the maximising measure itself, arguing that it should be regarded as the canonical or uniform measure on X. Primarily we work not with entropy itself but with its exponential, called diversity and (for finite spaces) used as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.
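For a finite metric space, the diversities the abstract refers to have an explicit formula: given a similarity matrix Z (typically Z_ij = exp(−d(i,j))) and a probability distribution p, the diversity of order q is D_q^Z(p) = (Σ_i p_i (Zp)_i^(q−1))^(1/(1−q)), with the q = 1 case taken as a limit. The sketch below is illustrative only (the function name and the example space are ours, not the paper's); when Z is the identity matrix, these reduce to the classical Hill numbers, whose q = 1 case is the exponential of Shannon entropy.

```python
import numpy as np

def diversity(p, Z, q):
    """Similarity-sensitive diversity of order q:
    D_q^Z(p) = (sum_i p_i (Zp)_i^(q-1))^(1/(1-q)),
    with q = 1 taken as the limit exp(-sum_i p_i log (Zp)_i)."""
    p = np.asarray(p, dtype=float)
    Zp = Z @ p
    mask = p > 0          # points of zero probability contribute nothing
    if q == 1:
        return float(np.exp(-np.sum(p[mask] * np.log(Zp[mask]))))
    return float(np.sum(p[mask] * Zp[mask] ** (q - 1)) ** (1.0 / (1.0 - q)))

# Example: three points on a line at positions 0, 1, 2,
# with similarity Z_ij = exp(-d(i, j)).
pos = np.array([0.0, 1.0, 2.0])
d = np.abs(pos[:, None] - pos[None, :])
Z = np.exp(-d)
p = np.full(3, 1 / 3)
print(diversity(p, Z, 1), diversity(p, Z, 2))
```

With Z the identity (all points totally dissimilar), the uniform distribution on n points has diversity n at every order, illustrating the "same maximum value for all q" phenomenon in its simplest case.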
