Error Bounds on a Mixed Entropy Inequality

05/29/2018
by James Melbourne et al.

Motivated by entropy computations relevant to evaluating the decrease in entropy in bit-reset operations, we investigate the deficit in an entropic inequality involving two independent random variables, one continuous and one discrete. In the case where the continuous random variable is Gaussian, we derive strong quantitative bounds on the deficit in the inequality. More explicitly, we show that the deficit decays sub-Gaussian-fast with respect to the reciprocal of the standard deviation of the Gaussian variable. Moreover, up to rational terms, these results are shown to be sharp.
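To make the inequality concrete: for X discrete and Z continuous and independent, the mixed entropy inequality compares the differential entropy h(X+Z) with H(X) + h(Z), and the deficit H(X) + h(Z) - h(X+Z) equals H(X | X+Z), hence is nonnegative and vanishes as the noise concentrates. The Python sketch below is our illustration, not code from the paper: it numerically estimates the deficit for X a fair Bernoulli variable on {0, 1} and Z Gaussian with standard deviation sigma; the function name deficit and all grid parameters are our own choices.

import numpy as np

def deficit(sigma, half_width=12.0, n=200001):
    """Estimate H(X) + h(Z) - h(X+Z) for X ~ Bernoulli(1/2) on {0, 1} and
    Z ~ N(0, sigma^2) independent of X (natural log, so entropies in nats)."""
    # X + Z has an equal-weight two-component Gaussian mixture density.
    y = np.linspace(-half_width, 1.0 + half_width, n)
    phi = lambda t: np.exp(-t**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)
    f = 0.5 * phi(y) + 0.5 * phi(y - 1.0)
    # h(X+Z) = -integral of f log f, via a Riemann sum; floor f to avoid log(0).
    dy = y[1] - y[0]
    h_sum = np.sum(-f * np.log(np.maximum(f, 1e-300))) * dy
    H_X = np.log(2.0)                                  # entropy of a fair bit
    h_Z = 0.5 * np.log(2.0 * np.pi * np.e * sigma**2)  # Gaussian differential entropy
    return H_X + h_Z - h_sum

for sigma in (1.0, 0.5, 0.25, 0.125):
    print(f"sigma = {sigma:5.3f}   deficit = {deficit(sigma):.3e}")

As sigma decreases, the printed deficits shrink at a rate consistent with exp(-c/sigma^2) for some constant c, which is the sub-Gaussian decay in 1/sigma described in the abstract; the paper's contribution is to make this decay quantitative and sharp up to rational terms.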



Related research

05/29/2018: The deficit in an entropic inequality
In this article, we investigate the entropy of a sum of a discrete and a...

01/20/2019: Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality
The entropy power inequality (EPI) and the Brascamp-Lieb inequality (BLI...

02/08/2021: Variations on a Theme by Massey
In 1994, James Lee Massey proposed the guessing entropy as a measure of ...

08/24/2018: The Entropy Power Inequality with quantum conditioning
The conditional Entropy Power Inequality is a fundamental inequality in ...

09/03/2022: The Gaussian product inequality conjecture for multinomial covariances
In this paper, we find an equivalent combinatorial condition only involv...

01/10/2022: Decision Trees with Soft Numbers
In the classical probability in continuous random variables there is no ...

03/29/2021: Asymptotically Optimal Massey-Like Inequality on Guessing Entropy With Application to Side-Channel Attack Evaluations
A Massey-like inequality is any useful lower bound on guessing entropy i...
