Sharp variance-entropy comparison for nonnegative Gaussian quadratic forms

05/24/2020
by Piotr Nayar, et al.

In this article we study quadratic forms in n independent standard Gaussian random variables. We show that, among nonnegative quadratic forms with fixed variance, the diagonal form with equal coefficients maximizes differential entropy. We also prove that the differential entropy of a weighted sum of i.i.d. exponential random variables with nonnegative weights is maximized, under fixed variance, when the weights are equal.
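Both statements can be probed numerically. The sketch below (not from the paper; the estimator and weight choices are my own) estimates differential entropy with the classical Vasicek m-spacing estimator and compares an equal-coefficient configuration against a skewed one at fixed variance, for both a nonnegative Gaussian quadratic form (sum of λ_i Z_i², variance 2 Σ λ_i²) and a weighted sum of Exp(1) variables (variance Σ a_i²):

```python
import math
import random


def vasicek_entropy(samples, m=None):
    """Vasicek m-spacing estimator of differential entropy (nats)."""
    xs = sorted(samples)
    n = len(xs)
    if m is None:
        m = int(round(math.sqrt(n)))
    total = 0.0
    for i in range(n):
        lo = xs[max(i - m, 0)]
        hi = xs[min(i + m, n - 1)]
        total += math.log(n / (2.0 * m) * (hi - lo))
    return total / n


def gaussian_quadratic_form(lambdas, n_samples, rng):
    """Samples of sum_i lam_i Z_i^2, rescaled to unit variance.

    Var(sum lam_i Z_i^2) = 2 * sum lam_i^2 for Z_i ~ N(0, 1).
    """
    norm = math.sqrt(2.0 * sum(l * l for l in lambdas))
    ls = [l / norm for l in lambdas]
    return [sum(l * rng.gauss(0.0, 1.0) ** 2 for l in ls)
            for _ in range(n_samples)]


def weighted_exp_sum(weights, n_samples, rng):
    """Samples of sum_i a_i E_i, rescaled to unit variance.

    Var(sum a_i E_i) = sum a_i^2 for E_i ~ Exp(1).
    """
    norm = math.sqrt(sum(w * w for w in weights))
    ws = [w / norm for w in weights]
    return [sum(w * rng.expovariate(1.0) for w in ws)
            for _ in range(n_samples)]


rng = random.Random(0)
N = 50_000

# Nonnegative quadratic form: equal coefficients vs. one dominant coefficient.
h_equal_quad = vasicek_entropy(gaussian_quadratic_form([1, 1, 1], N, rng))
h_skewed_quad = vasicek_entropy(gaussian_quadratic_form([1, 0.1, 0.1], N, rng))

# Weighted exponential sum: equal weights vs. one dominant weight.
h_equal_exp = vasicek_entropy(weighted_exp_sum([1, 1, 1], N, rng))
h_skewed_exp = vasicek_entropy(weighted_exp_sum([1, 0.1, 0.1], N, rng))

print(f"quadratic form: equal {h_equal_quad:.3f} vs skewed {h_skewed_quad:.3f}")
print(f"exponential sum: equal {h_equal_exp:.3f} vs skewed {h_skewed_exp:.3f}")
```

In both experiments the equal-coefficient sample should report the larger estimated entropy, consistent with the theorems; the gap is easy to see here because the skewed cases are close to a single chi-squared(1) variable and a single exponential, respectively, whose entropies at unit variance are well below the equal-weight cases.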


