Bounds on the Entropy of a Function of a Random Variable and their Applications

12/21/2017
by Ferdinando Cicalese, et al.

It is well known that the entropy H(X) of a discrete random variable X is always greater than or equal to the entropy H(f(X)) of a function f of X, with equality if and only if f is one-to-one. In this paper, we give tight bounds on H(f(X)) when the function f is not one-to-one, and we illustrate a few scenarios where this matters. As an intermediate step towards our main result, we derive a lower bound on the entropy of a probability distribution, when only a bound on the ratio between the maximal and minimal probabilities is known. The lower bound improves on previous results in the literature, and it could find applications outside the present scenario.
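As a quick numerical illustration of the inequality H(X) ≥ H(f(X)) stated above, the following minimal Python sketch (a toy example, not the paper's construction; the distribution and the function f are made up for illustration) computes the entropy of a small distribution and of its pushforward under a non-injective f that merges two outcomes, exhibiting the strict drop in entropy.

```python
import math
from collections import defaultdict

def entropy(probs):
    """Shannon entropy (in bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy distribution of X over {0, 1, 2, 3} (illustrative values only).
px = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}

# A non-injective f: it merges the outcomes 2 and 3 into a single value.
f = lambda x: min(x, 2)

# Pushforward distribution of f(X): masses of merged outcomes add up.
pfx = defaultdict(float)
for x, p in px.items():
    pfx[f(x)] += p

print(f"H(X)    = {entropy(px.values()):.4f} bits")   # ~1.8464
print(f"H(f(X)) = {entropy(pfx.values()):.4f} bits")  # ~1.5710 < H(X)
```

Since f is not one-to-one here, the inequality is strict; replacing f with any injection would make the two entropies coincide.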


Related research

12/08/2018 · Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression
This paper provides tight bounds on the Rényi entropy of a function of a...

01/11/2018 · An entropy inequality for symmetric random variables
We establish a lower bound on the entropy of weighted sums of (possibly ...

06/06/2013 · Tight Lower Bound on the Probability of a Binomial Exceeding its Expectation
We give the proof of a tight lower bound on the probability that a binom...

10/23/2021 · Signal to Noise Ratio Loss Function
This work proposes a new loss function targeting classification problems...

08/31/2022 · Generalizing Körner's graph entropy to graphons
Körner introduced the notion of graph entropy in 1973 as the minimal cod...

01/15/2018 · On the Distribution of Random Geometric Graphs
Random geometric graphs (RGGs) are commonly used to model networked syst...

09/03/2017 · A short note on the joint entropy of n/2-wise independence
In this note, we prove a tight lower bound on the joint entropy of n unb...
