
Understanding the Limitations of Variational Mutual Information Estimators

10/14/2019
by Jiaming Song, et al.

Variational approaches based on neural networks are showing promise for estimating mutual information (MI) between high-dimensional variables. However, they can be difficult to use in practice due to poorly understood bias/variance trade-offs. We theoretically show that, under some conditions, estimators such as MINE exhibit variance that can grow exponentially with the true amount of underlying MI. We also empirically demonstrate that existing estimators fail to satisfy basic self-consistency properties of MI, such as data processing and additivity under independence. Based on a unified perspective of variational approaches, we develop a new estimator that focuses on variance reduction. Empirical results demonstrate that our proposed estimator exhibits improved bias-variance trade-offs on standard benchmark tasks.
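The variance issue the abstract describes can be seen directly in the Donsker-Varadhan (DV) bound that MINE optimizes, I(X;Y) >= E_p(x,y)[f] - log E_p(x)p(y)[e^f]. The sketch below, a simplified illustration rather than the paper's method, evaluates this bound on correlated Gaussians using the known-optimal critic f* = log p(x,y)/(p(x)p(y)) (available in closed form here, whereas MINE would learn f with a neural network). The exp(f) term under the product of marginals is the source of the exponentially growing variance.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8
n = 200_000

# Jointly Gaussian samples with correlation rho; true MI is known in closed form.
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

def log_density_ratio(x, y, rho):
    """log p(x,y) - log p(x)p(y) for a standard bivariate Gaussian with correlation rho."""
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * (x**2 + y**2) - 2 * rho * x * y) / (2 * (1 - rho**2)))

# Donsker-Varadhan lower bound with the known-optimal critic:
#   I(X;Y) >= E_p(x,y)[f] - log E_p(x)p(y)[exp(f)]
joint_term = log_density_ratio(x, y, rho).mean()

# Shuffling y approximates samples from the product of the marginals.
y_shuffled = rng.permutation(y)
marg_term = np.log(np.exp(log_density_ratio(x, y_shuffled, rho)).mean())

dv_estimate = joint_term - marg_term
true_mi = -0.5 * np.log(1 - rho**2)
print(f"DV estimate: {dv_estimate:.4f}, true MI: {true_mi:.4f}")
```

Even with the optimal critic, the second term is a log of an empirical mean of exp(f), whose variance scales with the chi-squared divergence between the joint and the product of marginals; as rho (and hence MI) grows, far more samples are needed for a stable estimate, which is the failure mode the paper analyzes.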


Related Research

10/05/2020

DEMI: Discriminative Estimator of Mutual Information

Estimating mutual information between continuous random variables is oft...
05/16/2019

On Variational Bounds of Mutual Information

Estimating and optimizing Mutual Information (MI) is core to many proble...
11/18/2021

On Generalized Schürmann Entropy Estimators

We present a new class of estimators of Shannon entropy for severely und...
06/09/2022

On the Bias-Variance Characteristics of LIME and SHAP in High Sparsity Movie Recommendation Explanation Tasks

We evaluate two popular local explainability techniques, LIME and SHAP, ...
12/20/2022

fastMI: a fast and consistent copula-based estimator of mutual information

As a fundamental concept in information theory, mutual information (MI) ...
04/30/2018

On the Effect of Suboptimal Estimation of Mutual Information in Feature Selection and Classification

This paper introduces a new property of estimators of the strength of st...
03/13/2019

Variational Estimators for Bayesian Optimal Experimental Design

Bayesian optimal experimental design (BOED) is a principled framework fo...

Code Repositories

smile-mi-estimator

PyTorch implementation for the ICLR 2020 paper "Understanding the Limitations of Variational Mutual Information Estimators"


Reverse-Jensen_MI_estimation

Estimation of Mutual Information based on a reverse Jensen inequality approach
