Unifying the Brascamp-Lieb Inequality and the Entropy Power Inequality
The entropy power inequality (EPI) and the Brascamp-Lieb inequality (BLI) can both be viewed as information inequalities concerning entropies of linear transformations of random variables. The EPI provides lower bounds on the entropy of linear transformations of random vectors with independent components; the BLI, on the other hand, provides upper bounds on the entropy of a random vector in terms of the entropies of its linear transformations. In this paper, we present a new entropy inequality that generalizes both the BLI and the EPI by allowing a variety of independence relations among the components of a random vector. Our main technical contribution is a proof strategy that leverages the "doubling trick" to establish Gaussian optimality for certain entropy expressions under independence constraints.
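For orientation, the standard entropic forms of the two inequalities, and the usual shape of the doubling trick, are sketched below in LaTeX. The notation ($h(\cdot)$ for differential entropy, linear maps $B_j$, exponents $c_j$, constant $D$) is ours for illustration and is not drawn from the paper itself.

```latex
% Entropy power inequality (Shannon--Stam): for independent
% random vectors X, Y in R^n with densities, the sum X + Y
% cannot have too little entropy:
\[
  e^{\frac{2}{n} h(X+Y)} \;\ge\; e^{\frac{2}{n} h(X)} + e^{\frac{2}{n} h(Y)}.
\]

% Entropic Brascamp--Lieb inequality (Carlen--Cordero-Erausquin
% form): for linear maps B_j : R^n -> R^{n_j} and exponents
% c_j >= 0, the entropy of X is controlled from above by the
% entropies of its linear images:
\[
  h(X) \;\le\; \sum_{j=1}^{m} c_j \, h(B_j X) + D,
\]
% where the best constant D = sup_X [ h(X) - sum_j c_j h(B_j X) ]
% is attained (when finite) by Gaussian X.

% Doubling trick (a standard rendering, not quoted from the
% paper): take i.i.d. copies X_1, X_2 of a candidate extremizer
% and apply the rotation
\[
  (X_1, X_2) \;\mapsto\;
  \Big( \tfrac{X_1 + X_2}{\sqrt{2}},\; \tfrac{X_1 - X_2}{\sqrt{2}} \Big);
\]
% invariance of the extremal value under this rotation, iterated,
% forces the extremizer to be Gaussian.
```

The EPI sketch above illustrates the lower-bound regime and the entropic BLI the upper-bound regime; how the paper's single inequality interpolates between them under general independence structures is, of course, the subject of the full text.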