Information Measures for Entropy and Symmetry

11/27/2022
by Daniel Lazarev, et al.

Entropy and information can be considered dual: entropy is a measure of the subspace that the available information carves out of a given ambient space. Negative entropies, which arise in naïve extensions of the definition of entropy from discrete to continuous settings, are byproducts of the use of probabilities, which behave correctly in the discrete case only by a fortunate coincidence. We introduce notions such as sup-normalization and information measures, which allow for a generalization of the definition of entropy that preserves its interpretation as a subspace volume. Applying this framework in the context of topological groups and Haar measures, we elucidate the relationship between entropy, symmetry, and uniformity.
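
As a standard illustration of the negative entropies mentioned above (a worked example added for context, not taken from the paper): the naïve continuous extension of Shannon entropy is the differential entropy

h(X) = -\int f(x) \log f(x) \, dx,

and for X uniform on an interval [0, a], with density f(x) = 1/a, this evaluates to

h(X) = -\int_0^a \tfrac{1}{a} \log \tfrac{1}{a} \, dx = \log a,

which is negative whenever a < 1, unlike the discrete entropy H = -\sum_i p_i \log p_i, which is always nonnegative. Note that \log a is precisely the log-volume of the support [0, a], in keeping with the subspace-volume interpretation of entropy that the abstract describes.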

Related research

- The case for shifting the Rényi Entropy (11/14/2018): We introduce a variant of the Rényi entropy definition that aligns it wi...
- Typicality for stratified measures (12/21/2022): Stratified measures on Euclidean space are defined here as convex combin...
- Generalized active information: extensions to unbounded domains (11/12/2021): In the last three decades, several measures of complexity have been prop...
- Entropy under disintegrations (02/18/2021): We consider the differential entropy of probability measures absolutely ...
- Information entropy re-defined in a category theory context using preradicals (12/11/2021): Algebraically, entropy can be defined for abelian groups and their endom...
- Two-Disk Compound Symmetry Groups (02/25/2023): Symmetry is at the heart of much of mathematics, physics, and art. Tradi...
- On the entropy of rectifiable and stratified measures (06/01/2023): We summarize some results of geometric measure theory concerning rectifi...
