Reflections on Shannon Information: In search of a natural information-entropy for images

09/05/2016
by Kieran G. Larkin, et al.

It is not obvious how to extend Shannon's original information entropy to higher dimensions, and many different approaches have been tried. We replace the English-text symbol sequence originally used to illustrate the theory with a discrete, bandlimited signal. Using Shannon's later theory of sampling, we derive a new and symmetric version of the second-order entropy in 1D. The new theory then naturally extends to 2D and higher dimensions, where by naturally we mean simple, symmetric, isotropic and parsimonious. Simplicity arises from the direct application of Shannon's joint-entropy equalities and inequalities to the gradient (del) vector field image, which embodies the second-order relations of the scalar image. Parsimony is guaranteed by halving the vector data rate using Papoulis' generalized sampling expansion. The new 2D entropy measure, which we dub delentropy, is underpinned by a computable probability density function we call deldensity. The deldensity captures the underlying spatial image structure and pixel co-occurrence. It achieves this because each scalar image pixel value is nonlocally related to the entire gradient vector field. Both deldensity and delentropy are highly tractable and yield many interesting connections and useful inequalities. The new measure explicitly defines a realizable encoding algorithm and a corresponding reconstruction. Initial tests show that delentropy compares favourably with the conventional intensity-based histogram entropy and with the compressed data rates of lossless image encoders (GIF, PNG, WEBP, JP2K-LS and JPG-LS) for a selection of images. The symmetric approach may have applications in higher dimensions and to problems concerning image complexity measures.
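The abstract's recipe is concrete enough to sketch directly: form the gradient (del) vector field of the image, jointly histogram the two gradient components to estimate the deldensity, and take the Shannon entropy of that density. Below is a minimal, illustrative Python sketch of that pipeline, not the paper's reference implementation; in particular, the central-difference kernel used by np.gradient and the bin count are our assumptions, and the 1/2 factor follows the data-rate-halving (Papoulis) argument quoted above.

```python
import numpy as np

def delentropy(image, bins=256):
    """Illustrative delentropy sketch: Shannon entropy of the joint
    gradient-component density (the 'deldensity')."""
    # Gradient (del) vector field of the scalar image.
    # np.gradient uses central differences; the paper's exact
    # derivative kernel may differ.
    fy, fx = np.gradient(image.astype(float))

    # Deldensity estimate: joint 2D histogram of the gradient
    # components, normalised to a probability mass function.
    hist, _, _ = np.histogram2d(fx.ravel(), fy.ravel(), bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]  # drop empty bins; 0 log 0 = 0 by convention

    # Shannon entropy of the deldensity; the 1/2 reflects the
    # halving of the vector data rate discussed in the abstract.
    return -0.5 * np.sum(p * np.log2(p))
```

For a comparison with the conventional intensity-based histogram entropy mentioned in the abstract, the same entropy sum can be applied to np.histogram(image, bins) without the 1/2 factor.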


