
The Group Theoretic Roots of Information I: permutations, symmetry, and entropy
We propose a new interpretation of measures of information and disorder by connecting these concepts to group theory through their common relation to sets of permutations. A combinatorial measure of information and disorder, defined in terms of integers and discrete functions, is proposed, which we call the integer entropy. The Shannon measure of information is the limiting case of this richer, more general conceptual structure, which reveals relations among finite groups, information, and symmetries. It is shown that the integer entropy converges uniformly to the Shannon entropy when the group includes all permutations (the symmetric group) and the number of objects increases without bound. The harmonic numbers have a well-known combinatorial meaning as the expected number of disjoint, non-empty cycles in permutations of n objects; since the integer entropy is defined in terms of the expected number of cycles over the set of permutations, it too has a clear combinatorial meaning. Because every finite group is isomorphic to a subgroup of the symmetric group, every finite group has a corresponding information functional, analogous to the Shannon entropy, and a number series, analogous to the harmonic numbers. The Cameron-Semeraro cycle polynomial is used to analyze the integer entropy for finite groups and to characterize the series analogous to the harmonic numbers. Broken symmetries and conserved quantities are linked through the cycle properties of the groups, and we define an entropy functional for every finite group.
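The combinatorial fact underpinning the abstract's claim can be checked directly: the expected number of disjoint cycles over all permutations of n objects equals the n-th harmonic number H_n. The sketch below (function names are illustrative, not from the paper) verifies this by brute-force enumeration for small n.

```python
from fractions import Fraction
from itertools import permutations
from math import factorial


def cycle_count(perm):
    """Count the disjoint cycles of a permutation given as a tuple with i -> perm[i]."""
    seen = [False] * len(perm)
    cycles = 0
    for i in range(len(perm)):
        if not seen[i]:
            cycles += 1
            j = i
            while not seen[j]:
                seen[j] = True
                j = perm[j]
    return cycles


def expected_cycles(n):
    """Exact average number of cycles over all n! permutations of n objects."""
    total = sum(cycle_count(p) for p in permutations(range(n)))
    return Fraction(total, factorial(n))


def harmonic(n):
    """The n-th harmonic number H_n = 1 + 1/2 + ... + 1/n, as an exact fraction."""
    return sum(Fraction(1, k) for k in range(1, n + 1))


# The expected cycle count matches H_n exactly for each small n.
for n in range(1, 7):
    assert expected_cycles(n) == harmonic(n)
```

For example, the average over the 24 permutations of 4 objects is H_4 = 25/12; exact `Fraction` arithmetic avoids any floating-point ambiguity in the comparison.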