Remarks on the Rényi Entropy of a Sum of IID Random Variables

04/17/2019
by Benjamin Jaye, et al.

In this note we study a conjecture of Madiman and Wang predicting that the generalized Gaussian distribution minimizes the Rényi entropy of a sum of independent random variables. Through a variational analysis, we show that the generalized Gaussian fails to be a minimizer for this problem.
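For orientation, recall that the Rényi entropy of order α ≠ 1 of a random variable with density f is h_α(X) = (1/(1−α)) log ∫ f(x)^α dx, recovering Shannon entropy as α → 1. The minimal sketch below (not from the paper; the function names and grid parameters are illustrative) approximates h_α numerically for a standard Gaussian and checks it against the closed form h_α(N(0,σ²)) = ½ log(2πσ²) + log α / (2(α−1)):

```python
import math

def renyi_entropy_numeric(pdf, alpha, lo=-10.0, hi=10.0, n=200000):
    """Midpoint-rule approximation of h_alpha(X) = log(integral of f^alpha) / (1 - alpha)."""
    dx = (hi - lo) / n
    integral = sum(pdf(lo + (i + 0.5) * dx) ** alpha for i in range(n)) * dx
    return math.log(integral) / (1.0 - alpha)

def gaussian_pdf(x, sigma=1.0):
    # Density of N(0, sigma^2)
    return math.exp(-x * x / (2 * sigma * sigma)) / math.sqrt(2 * math.pi * sigma * sigma)

alpha = 2.0
numeric = renyi_entropy_numeric(gaussian_pdf, alpha)
closed_form = 0.5 * math.log(2 * math.pi) + math.log(alpha) / (2 * (alpha - 1))
print(numeric, closed_form)  # the two values agree to several decimal places
```

The same numeric routine can be pointed at the density of a sum of IID variables (obtained by convolution) to experiment with the kind of minimization question the note addresses.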


