Minimum-Entropy Couplings and Their Applications
Given two discrete random variables X and Y, with probability distributions p = (p_1, ..., p_n) and q = (q_1, ..., q_m), respectively, denote by C(p, q) the set of all couplings of p and q, that is, the set of all bivariate probability distributions having p and q as marginals. In this paper, we study the problem of finding a joint probability distribution in C(p, q) of minimum entropy (equivalently, a coupling that maximizes the mutual information between X and Y), and we discuss several situations where the need for this kind of optimization naturally arises. Since the optimization problem is known to be NP-hard, we give an efficient algorithm that finds a joint probability distribution in C(p, q) whose entropy exceeds the minimum possible by at most 1 bit, thus providing an approximation algorithm with an additive gap of at most 1 bit. Leveraging this algorithm, we extend our result to the problem of finding a minimum-entropy joint distribution of arbitrary k ≥ 2 discrete random variables X_1, ..., X_k, consistent with the known k marginal distributions of the individual random variables X_1, ..., X_k. In this case, our algorithm has an additive gap of at most k bits from the optimum. We also discuss several related applications of our findings, as well as extensions of our results to entropies other than the Shannon entropy.
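To make the coupling notion concrete, the sketch below constructs an element of C(p, q) with a well-known greedy heuristic (repeatedly matching the largest remaining masses of p and q) and measures its joint entropy. This is only an illustration, not the approximation algorithm analyzed in the paper; the names `greedy_coupling` and `entropy` are ours.

```python
import heapq
from math import log2

def greedy_coupling(p, q):
    """Greedy heuristic for a low-entropy coupling of p and q.

    Repeatedly matches the largest remaining mass in p with the
    largest remaining mass in q, assigns min of the two to that
    joint cell, and keeps the leftover. Returns a dict mapping
    (i, j) -> joint probability whose marginals are p and q.
    """
    # Max-heaps via negated values; entries are (-mass, index).
    hp = [(-pi, i) for i, pi in enumerate(p) if pi > 0]
    hq = [(-qj, j) for j, qj in enumerate(q) if qj > 0]
    heapq.heapify(hp)
    heapq.heapify(hq)
    joint = {}
    while hp and hq:
        a, i = heapq.heappop(hp)
        b, j = heapq.heappop(hq)
        a, b = -a, -b
        m = min(a, b)                     # mass assigned to cell (i, j)
        joint[(i, j)] = m
        if a - m > 1e-12:                 # push back any leftover mass
            heapq.heappush(hp, (-(a - m), i))
        if b - m > 1e-12:
            heapq.heappush(hq, (-(b - m), j))
    return joint

def entropy(dist):
    """Shannon entropy, in bits, of a distribution given by its values."""
    return -sum(v * log2(v) for v in dist.values() if v > 0)

p = [0.5, 0.25, 0.25]
q = [0.6, 0.4]
joint = greedy_coupling(p, q)
print(joint, entropy(joint))              # a coupling of p, q and its entropy
```

Each step removes the same amount of mass from both sides, so the two heaps empty simultaneously and the row and column sums of the returned table equal p and q by construction, i.e., the output is a valid member of C(p, q).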