An entropy functional bounded from above by one

04/20/2022
by John Çamkıran et al.

Shannon entropy is widely used for quantifying uncertainty in discrete random variables. When normalized to the unit interval, however, as is often done in practice, it fails to convey the alphabet size of the random variable under study. This work introduces an entropy functional based on the Jensen-Shannon divergence that is naturally bounded from above by one. Unlike normalized Shannon entropy, the new functional is strictly increasing in alphabet size under uniformity and is thus well suited to characterizing discrete random variables.
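The paper's exact JSD-based functional is not reproduced in this summary, so the sketch below only illustrates the two background facts the abstract relies on: the base-2 Jensen-Shannon divergence is bounded above by one, and normalized Shannon entropy equals one for every uniform distribution, so it carries no information about alphabet size. The function names (`shannon_entropy`, `normalized_entropy`, `js_divergence`) are illustrative choices, not taken from the paper.

```python
import numpy as np

def shannon_entropy(p, base=2.0):
    """Shannon entropy of a discrete distribution p, in units of log(base)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log 0 = 0
    return -np.sum(p * np.log(p)) / np.log(base)

def normalized_entropy(p):
    """Shannon entropy divided by its maximum log2(n); always in [0, 1]."""
    return shannon_entropy(p) / np.log2(len(p))

def js_divergence(p, q):
    """Base-2 Jensen-Shannon divergence between p and q; bounded above by 1."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)
    return shannon_entropy(m) - 0.5 * shannon_entropy(p) - 0.5 * shannon_entropy(q)

# Normalized Shannon entropy is 1 for every uniform distribution,
# regardless of alphabet size n -- the shortcoming the abstract points out.
for n in (2, 4, 16, 256):
    u = np.full(n, 1.0 / n)
    print(n, normalized_entropy(u))  # prints 1.0 for every n

# The base-2 JSD attains its upper bound of 1 on disjointly supported
# distributions, e.g. two distinct point masses.
print(js_divergence([1.0, 0.0], [0.0, 1.0]))  # prints 1.0
```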


Related research

Approximate Discrete Entropy Monotonicity for Log-Concave Sums (10/12/2022)
It is proved that for any n ≥ 1, if X_1,…,X_n are i.i.d. integer-valued,...

Representing Pareto optima in preordered spaces: from Shannon entropy to injective monotones (07/30/2021)
Shannon entropy is the most widely used measure of uncertainty. It is us...

A non-extensive entropy feature and its application to texture classification (03/08/2016)
This paper proposes a new probabilistic non-extensive entropy feature fo...

Simulation of Random Variables under Rényi Divergence Measures of All Orders (05/31/2018)
The random variable simulation problem consists in using a k-dimensional...

Entropic matroids and their representation (09/26/2019)
This paper investigates entropic matroids, that is, matroids whose rank ...

Change the coefficients of conditional entropies in extensivity (11/17/2020)
The Boltzmann–Gibbs entropy is a functional on the space of probability ...

An optimal approximation of discrete random variables with respect to the Kolmogorov distance (05/19/2018)
We present an algorithm that takes a discrete random variable X and a nu...
