
On the Difference Between the Information Bottleneck and the Deep Information Bottleneck
Combining the Information Bottleneck model with deep learning, by replacing the mutual information terms with deep neural networks, has proved successful in areas ranging from generative modelling to interpreting deep neural networks. In this paper, we revisit the Deep Variational Information Bottleneck (DVIB) and the assumptions needed for its derivation. The two assumed properties of the data X, Y and their latent representation T take the form of two Markov chains, T → X → Y and X → T → Y. Requiring both to hold during the optimisation process can be limiting for the set of potential joint distributions P(X, Y, T). We therefore show how to circumvent this limitation by optimising a lower bound on I(T; Y) for which only the latter Markov chain, X → T → Y, has to be satisfied. The actual mutual information then consists of the lower bound that is optimised in DVIB and cognate models in practice, plus two terms measuring how much the former requirement, T → X → Y, is violated. Finally, we propose to interpret the family of information bottleneck models as directed graphical models, and show that in this framework the original and deep information bottlenecks are special cases of a fundamental IB model.
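The variational lower bound on I(T; Y) that the abstract refers to can be illustrated on a toy discrete problem. The sketch below uses the standard bound I(T; Y) ≥ E[log q(y|t)] + H(Y), which holds for any decoder q(y|t) by Gibbs' inequality; all distributions here are illustrative assumptions, not taken from the paper, and the paper's additional correction terms for the T → X → Y chain are not reproduced.

```python
import numpy as np

# Toy discrete setup (illustrative numbers, not from the paper):
# X, Y, T each take two values.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])          # joint p(x, y)
p_t_given_x = np.array([[0.9, 0.1],
                        [0.2, 0.8]])   # stochastic encoder p(t | x)

p_y = p_xy.sum(axis=0)                 # marginal p(y)

# Joint p(x, y, t) under the Markov chain X -> T
# (T depends on Y only through X).
p_xyt = p_xy[:, :, None] * p_t_given_x[:, None, :]
p_ty = p_xyt.sum(axis=0).T             # p(t, y), indexed [t, y]
p_t = p_ty.sum(axis=1)                 # marginal p(t)

# Exact mutual information I(T; Y).
I_ty = np.sum(p_ty * np.log(p_ty / (p_t[:, None] * p_y[None, :])))

# Variational lower bound with an arbitrary (suboptimal) decoder q(y | t):
#   I(T; Y) >= E_{p(t, y)}[log q(y | t)] + H(Y)
q_y_given_t = np.array([[0.7, 0.3],
                        [0.3, 0.7]])   # hypothetical decoder choice
H_y = -np.sum(p_y * np.log(p_y))
bound = np.sum(p_ty * np.log(q_y_given_t)) + H_y

print(f"I(T;Y) = {I_ty:.4f}, variational bound = {bound:.4f}")
```

The bound never exceeds the true I(T; Y), and the gap closes exactly when q(y|t) equals the true conditional p(y|t); DVIB-style models tighten it by training the decoder jointly with the encoder.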