# The Group Theoretic Roots of Information I: permutations, symmetry, and entropy

We propose a new interpretation of measures of information and disorder by connecting these concepts to group theory. Entropy and group theory are linked here through their common relation to sets of permutations. We introduce a combinatorial measure of information and disorder, defined in terms of integers and discrete functions, that we call the integer entropy. The Shannon measure of information emerges as the limiting case of a richer, more general conceptual structure that reveals relations among finite groups, information, and symmetries. We show that the integer entropy converges uniformly to the Shannon entropy when the group comprises all permutations (the symmetric group) and the number of objects increases without bound. The harmonic numbers have a well-known combinatorial meaning as the expected number of disjoint, non-empty cycles in permutations of n objects; since the integer entropy is defined as the expected number of cycles over a set of permutations, it too has a clear combinatorial meaning. Because every finite group is isomorphic to a subgroup of a symmetric group, every finite group has a corresponding information functional, analogous to the Shannon entropy, and a number series, analogous to the harmonic numbers. We use the Cameron-Semeraro cycle polynomial to analyze the integer entropy for finite groups and to characterize the series analogous to the harmonic numbers. Broken symmetries and conserved quantities are linked through the cycle properties of the groups, and we define an entropy functional for every finite group.
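The combinatorial fact invoked above — that the expected number of disjoint cycles over all permutations of n objects equals the n-th harmonic number H_n — can be checked directly by brute-force enumeration. The sketch below is illustrative only (the function names are not from the paper) and uses exact rational arithmetic to avoid floating-point doubt:

```python
from itertools import permutations
from fractions import Fraction

def cycle_count(perm):
    """Count the disjoint, non-empty cycles of a permutation
    given as a tuple mapping index i -> perm[i] (0-indexed)."""
    seen = [False] * len(perm)
    cycles = 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return cycles

def expected_cycles(n):
    """Average number of cycles over all n! permutations of n objects."""
    total = 0
    count = 0
    for p in permutations(range(n)):
        total += cycle_count(p)
        count += 1
    return Fraction(total, count)

def harmonic(n):
    """H_n = 1 + 1/2 + ... + 1/n as an exact rational."""
    return sum(Fraction(1, k) for k in range(1, n + 1))

for n in range(1, 7):
    assert expected_cycles(n) == harmonic(n)
```

For example, the six permutations of three objects contribute 3 + 2 + 2 + 2 + 1 + 1 = 11 cycles in total, giving an average of 11/6 = H_3.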
