
Lower bound on Wyner's Common Information
An important notion of common information between two random variables is due to Wyner. In this paper, we derive a lower bound on Wyner's common information for continuous random variables. The new bound improves on the only other general lower bound on Wyner's common information, namely the mutual information. We also show that the new lower bound is tight in the so-called "Gaussian channels" case, i.e., when the joint distribution of the random variables can be written as a single underlying random variable plus independent Gaussian noises. We motivate this work by recent variations of Wyner's common information and by applications to network data compression problems such as the Gray-Wyner network.
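The claim that the mutual information is the previously best general lower bound can be illustrated in the bivariate Gaussian case, where both quantities have closed forms: for correlation coefficient rho, I(X;Y) = -0.5 ln(1 - rho^2), and Wyner's common information is known (from the Gaussian result of Xu, Liu, and Chen) to be C(X;Y) = 0.5 ln((1 + rho)/(1 - rho)). The following sketch (function names are illustrative, not from the paper) checks numerically that C dominates I, with gap exactly ln(1 + rho):

```python
import math

def mutual_information_gaussian(rho):
    """Mutual information (nats) of a bivariate Gaussian pair with
    correlation rho: I(X;Y) = -0.5 * ln(1 - rho^2)."""
    return -0.5 * math.log(1.0 - rho**2)

def wyner_common_information_gaussian(rho):
    """Closed-form Wyner common information (nats) for a bivariate
    Gaussian pair with correlation rho in [0, 1):
    C(X;Y) = 0.5 * ln((1 + rho) / (1 - rho))."""
    return 0.5 * math.log((1.0 + rho) / (1.0 - rho))

for rho in (0.1, 0.5, 0.9):
    I = mutual_information_gaussian(rho)
    C = wyner_common_information_gaussian(rho)
    # Wyner's common information always dominates mutual information;
    # algebraically, C - I = ln(1 + rho) >= 0.
    print(f"rho={rho}: I={I:.4f} nats, C={C:.4f} nats, gap={C - I:.4f}")
```

Since the gap ln(1 + rho) is strictly positive for rho > 0, the mutual information bound is loose for correlated Gaussians, which is the kind of slack the paper's new lower bound aims to reduce.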