Minimum entropy of a log-concave variable for fixed variance

09/04/2023
by James Melbourne et al.

We show that, among log-concave real random variables with fixed variance, the Shannon differential entropy is minimized by an exponential random variable. We apply this result to derive upper bounds on the capacities of additive noise channels with log-concave noise. We also improve the constants in reverse entropy power inequalities for log-concave random variables.
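The extremal role of the exponential distribution can be sanity-checked numerically. The sketch below (not from the paper; it uses standard closed-form entropies) compares a few log-concave laws at a common variance: the exponential attains the smallest differential entropy, while the Gaussian, the classical maximizer, attains the largest.

```python
import math

# Closed-form differential entropies (in nats) of some log-concave
# distributions, each parameterized to have standard deviation sigma.

def h_gaussian(sigma):
    # N(0, sigma^2): h = (1/2) ln(2*pi*e*sigma^2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma**2)

def h_exponential(sigma):
    # Exp(rate 1/sigma) has variance sigma^2: h = 1 + ln(sigma)
    return 1 + math.log(sigma)

def h_uniform(sigma):
    # Uniform on [0, a] with a = sqrt(12)*sigma has variance sigma^2: h = ln(a)
    return math.log(math.sqrt(12) * sigma)

def h_laplace(sigma):
    # Laplace with scale b = sigma/sqrt(2) has variance sigma^2: h = 1 + ln(2b)
    return 1 + math.log(2 * sigma / math.sqrt(2))

if __name__ == "__main__":
    sigma = 1.0
    for name, h in [("exponential", h_exponential(sigma)),
                    ("uniform", h_uniform(sigma)),
                    ("laplace", h_laplace(sigma)),
                    ("gaussian", h_gaussian(sigma))]:
        print(f"{name:12s} h = {h:.4f}")
```

At unit variance this prints roughly 1.0000 (exponential), 1.2425 (uniform), 1.3466 (Laplace), and 1.4189 (Gaussian), consistent with the exponential being the entropy minimizer at fixed variance.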

