DLITE: The Discounted Least Information Theory of Entropy

02/18/2020
by Weimao Ke, et al.

We propose an entropy-based information measure, the Discounted Least Information Theory of Entropy (DLITE), which not only exhibits the key characteristics expected of an information measure but also satisfies the conditions of a metric. Classic information measures such as Shannon Entropy, KL Divergence, and Jensen-Shannon Divergence exhibit some of these properties while lacking others. This work fills an important gap in the advancement of information theory and its applications, where such properties are desirable.
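For context on the classic measures the abstract compares against, the following is a minimal sketch (not the paper's DLITE implementation, whose formula is not given here) of Shannon entropy, KL divergence, and Jensen-Shannon divergence over discrete distributions. It illustrates one of the metric-related gaps: KL divergence is asymmetric, while JSD is symmetric.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i), in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """KL divergence D(p || q). Asymmetric in p and q, so not a metric."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence: averaged KL to the midpoint distribution.
    Symmetric and bounded; its square root is known to be a metric."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl_divergence(p, m) + 0.5 * kl_divergence(q, m)

p = [0.7, 0.2, 0.1]
q = [0.1, 0.4, 0.5]
print(shannon_entropy(p))
print(kl_divergence(p, q), kl_divergence(q, p))   # differ: KL is asymmetric
print(js_divergence(p, q), js_divergence(q, p))   # equal: JSD is symmetric
```

The example distributions `p` and `q` are arbitrary illustrations; any pair of discrete distributions on the same support would do.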


