On the Shannon entropy of the number of vertices with zero in-degree in randomly oriented hypergraphs

08/13/2018
by   Christos Pelekis, et al.

Suppose that you have n colours and m mutually independent dice, each of which has r sides. Each die lands on any of its sides with equal probability. You may colour the sides of each die in any way you wish, subject to one restriction: you are not allowed to use the same colour more than once on the sides of a single die. Any other colouring is allowed. Let X be the number of different colours that you see after rolling the dice. How should you colour the sides of the dice in order to maximize the Shannon entropy of X? In this article we investigate this question. We show that the entropy of X is at most (1/2)·log(n) + O(1) and that this bound is tight, up to a constant additive factor, in the case where there are equally many dice and colours. Our proof employs the differential entropy bound on discrete entropy, along with a lower bound on the entropy of binomial random variables whose outcome is conditioned to be an even integer. We conjecture that the entropy is maximized when the colours are distributed over the sides of the dice as evenly as possible.
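The setup above is straightforward to simulate. Below is a minimal Monte Carlo sketch, not taken from the paper: the near-uniform colouring scheme and the parameter choices n = m = 12, r = 3 are assumptions for illustration, motivated by the conjecture at the end of the abstract. It estimates the Shannon entropy of X empirically and prints (1/2)·log2(n) alongside it for comparison; since the bound carries an additive O(1) term, only the order of growth, not the exact value, should be read off.

```python
import math
import random
from collections import Counter


def make_colouring(n_colours, m_dice, r_sides):
    """Assign colours to the r sides of each die, spreading the n colours
    as evenly as possible and never repeating a colour on a single die.
    (This even spreading is an assumption, following the paper's conjecture.)"""
    colouring = []
    palette = list(range(n_colours))
    idx = 0
    for _ in range(m_dice):
        sides = [palette[(idx + k) % n_colours] for k in range(r_sides)]
        idx += r_sides
        # Requires r <= n so that no colour repeats on one die.
        assert len(set(sides)) == r_sides
        colouring.append(sides)
    return colouring


def empirical_entropy(colouring, trials=100_000, rng=random):
    """Monte Carlo estimate (in bits) of the Shannon entropy of X,
    the number of distinct colours seen after rolling every die once."""
    counts = Counter()
    for _ in range(trials):
        seen = {rng.choice(sides) for sides in colouring}
        counts[len(seen)] += 1
    return -sum((c / trials) * math.log2(c / trials) for c in counts.values())


if __name__ == "__main__":
    n, m, r = 12, 12, 3  # equally many dice and colours, the regime where the bound is tight
    colouring = make_colouring(n, m, r)
    print("estimated H(X):", round(empirical_entropy(colouring), 3))
    print("0.5 * log2(n): ", round(0.5 * math.log2(n), 3))
```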

