Entropy versus variance for symmetric log-concave random variables and related problems

11/01/2018
by Mokshay Madiman, et al.

We show that the uniform distribution minimises entropy among all symmetric log-concave distributions with fixed variance. We also construct a counter-example concerning monotonicity and entropy comparison for weighted sums of independent, identically distributed log-concave random variables.
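
A short worked form of the main claim may be useful; this is a sketch assuming h denotes differential entropy and σ² = Var(X), with the stated theorem taken as given. The uniform law U on [-a, a] has h(U) = \log(2a) and \mathrm{Var}(U) = a^2/3, so matching variances (a = \sqrt{3\sigma^2}) turns the minimality of the uniform into the explicit bound

    h(X) \ge \tfrac{1}{2}\log\!\left(12\,\sigma^2\right),

with equality exactly for the uniform distribution. Combined with the classical Gaussian maximum-entropy bound, any symmetric log-concave X therefore satisfies

    \tfrac{1}{2}\log\!\left(12\,\sigma^2\right) \le h(X) \le \tfrac{1}{2}\log\!\left(2\pi e\,\sigma^2\right).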

Related research

08/23/2021 · Rényi entropy and variance comparison for symmetric log-concave random variables
We show that for any α>0 the Rényi entropy of order α is minimized, amon...

09/04/2023 · Minimum entropy of a log-concave variable for fixed variance
We show that for log-concave real random variables with fixed variance t...

04/04/2019 · Two remarks on generalized entropy power inequalities
This note contributes to the understanding of generalized entropy power ...

02/18/2021 · Convolution of a symmetric log-concave distribution and a symmetric bimodal distribution can have any number of modes
In this note, we show that the convolution of a discrete symmetric log-c...

05/24/2020 · Sharp variance-entropy comparison for nonnegative gaussian quadratic forms
In this article we study quadratic forms in n independent standard norma...

04/04/2022 · Scalable random number generation for truncated log-concave distributions
Inverse transform sampling is an exceptionally general method to generat...

07/23/2018 · On Enumerating Distributions for Associated Vectors in the Entropy Space
This paper focuses on the problem of finding a distribution for an assoc...
