Learn your entropy from informative data: an axiom ensuring the consistent identification of generalized entropies

01/13/2023
by Andrea Somazzi, et al.

Shannon entropy, a cornerstone of information theory, statistical physics and inference methods, is uniquely identified by the Shannon-Khinchin or Shore-Johnson axioms. Generalizations of Shannon entropy, motivated by the study of non-extensive or non-ergodic systems, relax some of these axioms and lead to entropy families indexed by certain `entropic' parameters. In general, selecting these parameters requires prior knowledge of the system or leads to inconsistencies. Here we introduce a simple axiom for any entropy family: namely, that no entropic parameter can be inferred from a completely uninformative (uniform) probability distribution. When applied to the Uffink-Jizba-Korbel and Hanel-Thurner entropies, the axiom selects only Rényi entropy as viable. It also extends consistency with the Maximum Likelihood principle, which can then be generalized to estimate the entropic parameter purely from data, as we confirm numerically. Remarkably, in a generalized maximum-entropy framework the axiom implies that the maximized log-likelihood always equals minus Shannon entropy, even if the inferred probability distribution maximizes a generalized entropy and not Shannon's, solving a series of problems encountered in previous approaches.
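The intuition behind the axiom can be checked directly for Rényi entropy, H_α(p) = log(Σ_i p_i^α) / (1 − α): on a uniform distribution over n outcomes, H_α = log(n) for every order α, so the entropic parameter leaves no trace in uninformative data, while a non-uniform distribution does depend on α. A minimal numerical sketch (the helper `renyi_entropy` is illustrative, not code from the paper):

```python
import numpy as np

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    p = np.asarray(p, dtype=float)
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

n = 8
uniform = np.full(n, 1.0 / n)

# Uniform distribution: H_alpha = log(n) regardless of alpha,
# so alpha cannot be inferred from uninformative data.
for alpha in (0.5, 2.0, 5.0):
    assert np.isclose(renyi_entropy(uniform, alpha), np.log(n))

# A non-uniform distribution does depend on alpha.
p = np.array([0.7, 0.2, 0.1])
print(renyi_entropy(p, 0.5), renyi_entropy(p, 2.0))
```

This is exactly the degeneracy the axiom demands: only informative (non-uniform) data can pin down the entropic parameter.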

