The deficit in an entropic inequality

05/29/2018
by James Melbourne et al.

In this article, we investigate the entropy of the sum of a discrete and a continuous random variable. We obtain bounds on the entropy of the sum when the continuous random variable is Gaussian or, more generally, log-concave. These in turn yield bounds on the capacity of the additive-noise channel in which the discrete random variable is the input and the output is the input corrupted by noise modeled by the continuous random variable. The bounds are shown to be sharp when the discrete variable is Bernoulli.
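
To make the quantities concrete, consider the simplest case the abstract highlights: a Bernoulli input B corrupted by independent Gaussian noise Z. The sketch below is a minimal numerical illustration, not the paper's construction; the parameters p and sigma are assumed for the example. It evaluates the entropy of the sum h(B+Z) (a two-component Gaussian mixture), the deficit in the elementary mixture bound h(B+Z) <= H(B) + h(Z), and the mutual information I(B; B+Z) = h(B+Z) - h(Z), whose maximum over input distributions is the channel capacity in question.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Assumed toy setting: B ~ Bernoulli(p) is the channel input and
# Z ~ N(0, sigma^2) is independent additive noise, so the output
# Y = B + Z has the two-component Gaussian mixture density below.
p, sigma = 0.5, 0.3

def f_Y(y):
    return (1 - p) * norm.pdf(y, loc=0.0, scale=sigma) \
         + p * norm.pdf(y, loc=1.0, scale=sigma)

# Differential entropy h(Y) = -integral of f log f, in nats, computed
# numerically over a window wide enough to capture both components.
h_Y = quad(lambda y: -f_Y(y) * np.log(f_Y(y)),
           -10 * sigma, 1 + 10 * sigma, limit=200)[0]

H_B = -(p * np.log(p) + (1 - p) * np.log(1 - p))  # Shannon entropy of B
h_Z = 0.5 * np.log(2 * np.pi * np.e * sigma**2)   # closed-form Gaussian entropy

deficit = H_B + h_Z - h_Y  # nonnegative gap in h(B+Z) <= H(B) + h(Z)
mi = h_Y - h_Z             # I(B; Y) = h(Y) - h(Y|B); the channel capacity
                           # is the maximum of this quantity over p.

print(f"H(B) = {H_B:.4f} nats, h(Z) = {h_Z:.4f} nats, h(B+Z) = {h_Y:.4f} nats")
print(f"deficit = H(B) + h(Z) - h(B+Z) = {deficit:.4f} nats")
print(f"I(B; B+Z) = {mi:.4f} nats (at most H(B) = {H_B:.4f})")
```

With p = 1/2 and sigma = 0.3 the two mixture components barely overlap, so the deficit is near zero and I(B; B+Z) approaches H(B) = log 2; increasing sigma enlarges the deficit and drives the mutual information toward zero.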


Related research

- Error Bounds on a Mixed Entropy Inequality (05/29/2018): Motivated by the entropy computations relevant to the evaluation of decr...
- Reversals of Rényi Entropy Inequalities under Log-Concavity (05/21/2020): We establish a discrete analog of the Rényi entropy comparison due to Bo...
- Saturable Generalizations of Jensen's Inequality (02/24/2021): Jensen's inequality can be thought of as answering the question of how know...
- Variations on a Theme by Massey (02/08/2021): In 1994, James Lee Massey proposed the guessing entropy as a measure of ...
- Tight Bounds on the Rényi Entropy via Majorization with Applications to Guessing and Compression (12/08/2018): This paper provides tight bounds on the Rényi entropy of a function of a...
- Testability of the exclusion restriction in continuous instrumental variable models (06/25/2018): In this note we prove Pearl's conjecture, showing that the exclusion res...
- Nonlinear Information Bottleneck (05/06/2017): Information bottleneck (IB) is a technique for extracting information in...
