A short note on the joint entropy of n/2-wise independence

09/03/2017
by Amey Bhangale, et al.

In this note, we prove a tight lower bound on the joint entropy of n unbiased Bernoulli random variables that are n/2-wise independent. For general k-wise independence, we give new lower bounds by adapting Navon and Samorodnitsky's Fourier-analytic proof of the `LP bound' on error-correcting codes. This constitutes partial progress on a question posed by Gavinsky and Pudlák.
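To see why the joint entropy of k-wise independent bits can fall below n, consider the classic parity construction (a standard textbook example, not taken from this paper): n−1 independent uniform bits together with their XOR are (n−1)-wise independent, yet their joint entropy is only n−1 bits. The sketch below verifies this for n = 4.

```python
from itertools import combinations, product
from math import log2

def joint_entropy(dist):
    """Shannon entropy in bits of a distribution {outcome: prob}."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

# Parity construction: X1, X2, X3 are uniform independent bits and
# X4 = X1 XOR X2 XOR X3. The four bits are 3-wise independent
# (any three of them are uniform on {0,1}^3), yet the joint
# entropy is 3 bits rather than 4.
n = 4
dist = {}
for x in product([0, 1], repeat=n - 1):
    outcome = x + (x[0] ^ x[1] ^ x[2],)
    dist[outcome] = dist.get(outcome, 0) + 2 ** -(n - 1)

# Sanity check: every 3 of the 4 coordinates are uniform on {0,1}^3.
for idx in combinations(range(n), 3):
    marg = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0) + p
    assert all(abs(p - 1 / 8) < 1e-12 for p in marg.values())

print(joint_entropy(dist))  # 3.0 bits for 4 variables
```

The lower bounds in the note go in the other direction: they quantify how much joint entropy k-wise independence forces, showing that constructions like the one above cannot be compressed much further.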

