The case for shifting the Rényi Entropy

We introduce a variant of the Rényi entropy definition that aligns it with the well-known Hölder mean: in the new formulation, the r-th order Rényi entropy is the logarithm of the inverse of the r-th order Hölder mean. This reformulation yields new insights into the relationship between the Rényi entropy and closely related quantities, such as the information potential and the partition function of statistical mechanics. We also provide expressions that allow the Rényi entropies to be calculated from the Shannon cross-entropy and the escort probabilities. Finally, we discuss why shifting the Rényi entropy is fruitful in some applications.
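
As a concrete illustration of the shifted formulation, the sketch below computes the r-th order entropy as the negative logarithm of a self-weighted Hölder mean and checks it against the standard Rényi entropy of order α = r + 1. The shift r = α − 1 and the self-weighting M_r(p; p) = (Σ_i p_i p_i^r)^(1/r) are assumptions read off the abstract's statement, not the paper's exact notation.

```python
import numpy as np

def renyi_standard(p, alpha):
    # Standard Rényi entropy: H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), alpha != 1.
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

def holder_mean(p, r):
    # Hölder (power) mean of the probabilities, weighted by the probabilities themselves:
    # M_r(p; p) = (sum_i p_i * p_i^r)^(1/r), r != 0.  (Assumed self-weighting.)
    return np.sum(p * p ** r) ** (1.0 / r)

def renyi_shifted(p, r):
    # Shifted form: the r-th order entropy as the logarithm of the inverse Hölder mean.
    return -np.log(holder_mean(p, r))

p = np.array([0.5, 0.25, 0.125, 0.125])
r = 1.0                                 # shifted order r, corresponding to alpha = r + 1
print(renyi_shifted(p, r))              # -log M_1(p; p) ≈ 1.0678
print(renyi_standard(p, r + 1.0))       # standard Rényi entropy of order 2, same value
```

Under these assumptions the two expressions agree algebraically, since (1/(1−α)) log Σ_i p_i^α = −log (Σ_i p_i p_i^{α−1})^{1/(α−1)} for α ≠ 1.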
