An entropy inequality for symmetric random variables

01/11/2018
by Jing Hao, et al.

We establish a lower bound on the entropy of weighted sums of (possibly dependent) random variables (X_1, X_2, ..., X_n) possessing a symmetric joint distribution. Our lower bound is in terms of the joint entropy of (X_1, X_2, ..., X_n). We show that for n ≥ 3, the lower bound is tight if and only if the X_i are i.i.d. Gaussian random variables. For n = 2, there are numerous other equality cases beyond i.i.d. Gaussians, which we characterize completely. Going beyond sums, we also present an inequality for certain linear transformations of (X_1, ..., X_n). Our primary technical contribution lies in the analysis of the equality cases, and our approach relies on the geometry and the symmetry of the problem.
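The abstract states the lower bound only in words. As a consistency check on the claimed Gaussian equality case, the following worked LaTeX computation verifies that unit-norm weighted sums of i.i.d. Gaussians satisfy h(∑ a_i X_i) = h(X_1, ..., X_n)/n. The normalization of the bound assumed here, h(∑ a_i X_i) ≥ h(X_1, ..., X_n)/n for ||a||_2 = 1, is an illustrative assumption consistent with the Gaussian scaling, not a formula quoted from the paper.

\documentclass{article}
\usepackage{amsmath, amssymb}
\begin{document}
% Assumed bound form (illustration only, not quoted from the paper):
% h(a_1 X_1 + \dots + a_n X_n) \ge (1/n) h(X_1, \dots, X_n) for unit vectors a.
Let $X_1, \dots, X_n$ be i.i.d.\ $\mathcal{N}(0, \sigma^2)$ and let
$a \in \mathbb{R}^n$ satisfy $\|a\|_2 = 1$. Then
\begin{align*}
  h(X_1, \dots, X_n) &= \frac{n}{2} \log\bigl(2 \pi e \sigma^2\bigr)
    && \text{(joint entropy of independent Gaussians)} \\
  \sum_{i=1}^{n} a_i X_i &\sim \mathcal{N}\Bigl(0,\ \sigma^2 \sum_{i=1}^{n} a_i^2\Bigr)
    = \mathcal{N}(0, \sigma^2)
    && \text{(stability of Gaussians, using $\|a\|_2 = 1$)} \\
  h\Bigl(\sum_{i=1}^{n} a_i X_i\Bigr) &= \frac{1}{2} \log\bigl(2 \pi e \sigma^2\bigr)
    = \frac{1}{n}\, h(X_1, \dots, X_n),
\end{align*}
so i.i.d.\ Gaussians attain the assumed bound with equality, which is consistent
with the tightness claim in the abstract.
\end{document}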


Related research

05/08/2019 · An Entropy Power Inequality for Discrete Random Variables
Let N_d[X] = (1/(2πe)) e^{2H[X]} denote the entropy power of the discrete rand...

02/16/2021 · Lower bound on Wyner's Common Information
An important notion of common information between two random variables i...

09/03/2017 · A short note on the joint entropy of n/2-wise independence
In this note, we prove a tight lower bound on the joint entropy of n unb...

12/21/2017 · Bounds on the Entropy of a Function of a Random Variable and their Applications
It is well known that the entropy H(X) of a discrete random variable X i...

01/10/2022 · Decision Trees with Soft Numbers
In the classical probability in continuous random variables there is no ...

04/24/2018 · Rate-Distortion Theory for General Sets and Measures
This paper is concerned with a rate-distortion theory for sequences of i...

01/21/2019 · Equality in the Matrix Entropy-Power Inequality and Blind Separation of Real and Complex sources
The matrix version of the entropy-power inequality for real or complex c...
