The Compound Information Bottleneck Outlook
We formulate and analyze the compound information bottleneck program. In this problem, a Markov chain X → Y → Z is assumed with fixed marginal distributions P_X and P_Y, and the mutual information between X and Z is to be maximized over the choice of the conditional probability of Z given Y from a given class, under the worst choice of the joint probability of the pair (X, Y) from a different class. We consider several classes based on extremes of mutual information, minimal correlation, total variation, and relative entropy. We provide values, bounds, and various characterizations for specific instances of this problem: the binary symmetric case, the scalar Gaussian case, the vector Gaussian case, and the symmetric modulo-additive case. Finally, for the general case, we propose a Blahut-Arimoto type of alternating-iterations algorithm to find a consistent solution to this problem.
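As a reading aid (notation is ours, not taken verbatim from the paper): writing A for the admissible class of conditionals P_{Z|Y} and B for the uncertainty class of joints P_{X,Y} consistent with the fixed marginals P_X and P_Y, the compound program described above is the max-min problem

    max_{P_{Z|Y} ∈ A}  min_{P_{X,Y} ∈ B}  I(X; Z),    subject to the Markov chain X → Y → Z.

The sketch below is an illustrative approximation only: it implements the classical (non-compound) information bottleneck self-consistent updates for a single fixed joint P_{X,Y}, to show the kind of Blahut-Arimoto-style alternating iterations referred to above. The function name, arguments, and the NumPy-based implementation are our own assumptions; the inner minimization over the class of joints used in the paper's compound algorithm is not shown.

    import numpy as np

    def ib_iterations(p_xy, num_z, beta, iters=200, seed=0, eps=1e-12):
        """Classical IB alternating updates for the chain X - Y - Z with a fixed
        joint p_xy of shape (|X|, |Y|).  Assumes full support; returns the
        encoder P_{Z|Y} as a (num_z, |Y|) column-stochastic array.
        Illustrative sketch, not the paper's compound algorithm."""
        rng = np.random.default_rng(seed)
        p_y = p_xy.sum(axis=0)                        # marginal P_Y
        p_x_given_y = p_xy / p_y                      # column y holds P_{X|Y=y}
        p_z_given_y = rng.random((num_z, p_xy.shape[1]))
        p_z_given_y /= p_z_given_y.sum(axis=0)        # random initial encoder
        for _ in range(iters):
            p_z = p_z_given_y @ p_y                   # marginal P_Z
            p_zy = p_z_given_y * p_y                  # joint P_{Z,Y}
            p_y_given_z = p_zy / (p_zy.sum(axis=1, keepdims=True) + eps)
            p_x_given_z = p_x_given_y @ p_y_given_z.T             # decoder P_{X|Z}
            # kl[z, y] = D( P_{X|Y=y} || P_{X|Z=z} )
            log_ratio = np.log((p_x_given_y.T[None, :, :] + eps)
                               / (p_x_given_z.T[:, None, :] + eps))
            kl = np.einsum('zyx,xy->zy', log_ratio, p_x_given_y)
            p_z_given_y = p_z[:, None] * np.exp(-beta * kl)       # encoder update
            p_z_given_y /= p_z_given_y.sum(axis=0)                # renormalize columns
        return p_z_given_y

A toy usage, with a hypothetical 2x2 joint:

    p_xy = np.array([[0.4, 0.1],
                     [0.1, 0.4]])
    encoder = ib_iterations(p_xy, num_z=2, beta=5.0)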