Introduction to Logical Entropy and its Relationship to Shannon Entropy

12/03/2021
by David Ellerman, et al.

We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communication that quantified an "amount of information," but, as he pointed out, "no concept of information itself was defined." Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set, just as the Boole-Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. Partitions and subsets are mathematically dual concepts, so the logic of partitions is dual in that sense to the usual Boolean logic of subsets, and hence the name "logical entropy." The logical entropy of a partition has a simple interpretation: it is the probability that a distinction or "dit" (a pair of elements in different blocks) is obtained in two independent draws from the underlying set. Shannon entropy is shown to be based on this same notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that must be joined to make the same distinctions as the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding concepts of Shannon entropy by a uniform non-linear dit-bit transform. Finally, logical entropy linearizes naturally to the corresponding quantum concept: the quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.

Keywords: logical entropy, Shannon entropy, partitions, MaxEntropy, quantum logical entropy, von Neumann entropy
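As a minimal illustrative sketch (not taken from the paper; the block probabilities, the example state, and the function names below are assumptions for the example), these quantities can be computed directly from a partition's block probabilities and, in the quantum case, from a density matrix and the observable's eigenspace projectors:

```python
from math import log2
import numpy as np

def logical_entropy(block_probs):
    """h = 1 - sum(p_B^2): the probability that two independent draws from
    the underlying set give a distinction (elements in different blocks)."""
    return 1.0 - sum(p * p for p in block_probs)

def shannon_entropy(block_probs):
    """H = sum(p_B * log2(1/p_B)): the average minimum number of binary
    partitions (bits) that must be joined to make the same distinctions."""
    return sum(p * log2(1.0 / p) for p in block_probs if p > 0)

def quantum_logical_entropy(rho, projectors):
    """Probability that two independent projective measurements of the
    observable (given by its eigenspace projectors) on state rho yield
    different eigenvalues: 1 - sum(p_i^2) with p_i = tr(P_i rho)."""
    probs = [float(np.trace(P @ rho).real) for P in projectors]
    return 1.0 - sum(p * p for p in probs)

# Partition of an 8-element set into blocks of sizes 4, 2, 2.
probs = [4/8, 2/8, 2/8]
print(logical_entropy(probs))   # 0.625 = probability of drawing a dit
print(shannon_entropy(probs))   # 1.5 bits
# The dit-bit transform replaces each (1 - p_B) in sum(p_B * (1 - p_B))
# with log2(1/p_B), turning the logical-entropy sum into the Shannon sum.

# Maximally mixed qubit measured in the computational basis.
rho = np.eye(2) / 2
P0, P1 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])
print(quantum_logical_entropy(rho, [P0, P1]))  # 0.5
```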


