
On Variational Bounds of Mutual Information

05/16/2019
by Ben Poole, et al.

Estimating and optimizing Mutual Information (MI) is core to many problems in machine learning; however, bounding MI in high dimensions is challenging. To establish tractable and scalable objectives, recent work has turned to variational bounds parameterized by neural networks, but the relationships and tradeoffs between these bounds remain unclear. In this work, we unify these recent developments in a single framework. We find that the existing variational lower bounds degrade when the MI is large, exhibiting either high bias or high variance. To address this problem, we introduce a continuum of lower bounds that encompasses previous bounds and flexibly trades off bias and variance. On high-dimensional, controlled problems, we empirically characterize the bias and variance of the bounds and their gradients and demonstrate the effectiveness of our new bounds for estimation and representation learning.
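One of the variational lower bounds unified in this line of work is the contrastive (InfoNCE-style) bound, which scores K candidate pairs with a learned critic and is capped at log K, placing it at the low-variance, high-bias end of the tradeoff described above. Below is a minimal NumPy sketch of such an estimator for a precomputed critic score matrix; the function name and setup are illustrative, not taken from the paper's code.

```python
import numpy as np

def infonce_lower_bound(scores):
    """Contrastive (InfoNCE-style) lower bound on mutual information.

    `scores` is a K x K critic matrix with scores[i, j] = f(x_i, y_j),
    where the diagonal holds scores of jointly sampled (positive) pairs.
    The estimate is the mean log-softmax of each positive pair plus
    log K, so it can never exceed log K.
    """
    K = scores.shape[0]
    # Numerically stable row-wise log-sum-exp over candidate pairings.
    row_max = scores.max(axis=1, keepdims=True)
    logsumexp = row_max[:, 0] + np.log(np.exp(scores - row_max).sum(axis=1))
    # Average log-softmax of the positive (diagonal) entries, shifted by log K.
    return float(np.mean(np.diag(scores) - logsumexp) + np.log(K))
```

With an uninformative critic (all scores equal) the estimate is 0, and with a sharply discriminative critic it saturates near log K, illustrating the bias at high MI that motivates the paper's interpolated bounds.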

Related Research

10/05/2020

DEMI: Discriminative Estimator of Mutual Information

Estimating mutual information between continuous random variables is oft...
06/09/2020

Neural Methods for Point-wise Dependency Estimation

Since its inception, the neural estimation of mutual information (MI) ha...
05/30/2020

On lower bounds for the bias-variance trade-off

It is a common phenomenon that for high-dimensional and nonparametric st...
10/14/2019

Understanding the Limitations of Variational Mutual Information Estimators

Variational approaches based on neural networks are showing promise for ...
07/08/2016

Lower Bounds on Active Learning for Graphical Model Selection

We consider the problem of estimating the underlying graph associated wi...
02/24/2022

Estimators of Entropy and Information via Inference in Probabilistic Models

Estimating information-theoretic quantities such as entropy and mutual i...

Code Repositories

CLUB

Code for ICML2020 paper - CLUB: A Contrastive Log-ratio Upper Bound of Mutual Information



doe

Difference-of-Entropies (DoE) Estimator



Reverse-Jensen_MI_estimation

Estimation of Mutual Information based on a reverse Jensen inequality approach



eth-atml-fall19

Presentation for the Advanced Topics in Machine Learning seminar, Fall 2019

