Rényi entropy and variance comparison for symmetric log-concave random variables

08/23/2021
by Maciej Białobrzeski, et al.

We show that for any α > 0 the Rényi entropy of order α is minimized, among all symmetric log-concave random variables with fixed variance, either by a uniform distribution or by a two-sided exponential distribution. The first case occurs for α ∈ (0, α*] and the second for α ∈ [α*, ∞), where α* satisfies the equation (1/(α* − 1)) log α* = (1/2) log 6, that is, α* ≈ 1.241.
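The threshold α* can be recovered numerically from its defining equation. A minimal sketch (plain bisection, not taken from the paper) using only the standard library, exploiting the fact that log(α)/(α − 1) is strictly decreasing for α > 1:

```python
import math

def f(a):
    # Residual of the defining equation: log(a)/(a-1) - (1/2) log 6.
    # This is strictly decreasing in a on (1, inf), so bisection applies.
    return math.log(a) / (a - 1.0) - 0.5 * math.log(6.0)

# Bracket the root: f(1.01) > 0, f(2.0) < 0.
lo, hi = 1.01, 2.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if f(mid) > 0:
        lo = mid
    else:
        hi = mid

alpha_star = 0.5 * (lo + hi)
print(round(alpha_star, 3))  # close to 1.241, matching the abstract
```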


Related research

11/01/2018
Entropy versus variance for symmetric log-concave random variables and related problems
We show that the uniform distribution minimises entropy among all symmet...

10/22/2020
On Mean Estimation for Heteroscedastic Random Variables
We study the problem of estimating the common mean μ of n independent sy...

12/18/2022
Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
We utilize a discrete version of the notion of degree of freedom to prov...

05/24/2020
Sharp variance-entropy comparison for nonnegative gaussian quadratic forms
In this article we study quadratic forms in n independent standard norma...

02/16/2021
Sample variance of rounded variables
If the rounding errors are assumed to be distributed independently from ...

12/09/2020
Estimation of first-order sensitivity indices based on symmetric reflected Vietoris-Rips complexes areas
In this paper we estimate the first-order sensitivity index of random va...

10/17/2022
New metrics for risk analysis
This paper introduces a new framework for risk analysis for distribution...
