Data-Efficient Mutual Information Neural Estimator

05/08/2019
by Xiao Lin, et al.

Measuring mutual information (MI) between high-dimensional, continuous random variables from observed samples has wide theoretical and practical applications. Traditional MI estimators, such as the Kraskov estimator (Kraskov et al. 2004), are capable of capturing MI between low-dimensional signals, but they fall short as dimensionality increases and do not scale. Existing neural approaches, such as MINE (Belghazi et al. 2018), search for a d-dimensional neural network that maximizes a variational lower bound on mutual information; however, this requires O(d log d) observed samples to keep the neural network from overfitting. For practical mutual information estimation in real-world applications, data is not always available in surplus, especially when its acquisition is prohibitively expensive, as in fMRI analysis. We introduce DEMINE, a scalable, data-efficient mutual information estimator. By coupling a learning-based view of the MI lower bound with meta-learning, DEMINE achieves high-confidence estimation irrespective of network size and with improved accuracy at practical dataset sizes. We demonstrate the effectiveness of DEMINE on synthetic benchmarks as well as a real-world application: fMRI inter-subject correlation analysis.
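As a minimal sketch of the variational lower bound the abstract refers to: MINE maximizes the Donsker-Varadhan bound I(X;Y) ≥ E_p(x,y)[T(x,y)] − log E_p(x)p(y)[exp(T(x,y))] over a neural critic T. The code below is not the paper's method; instead of training a critic, it plugs in the analytically optimal critic for correlated Gaussians (the log density ratio), for which the bound is tight, so the estimate should approach the true MI of −½·log(1−ρ²). All names here (`critic`, `dv_lower_bound`) are illustrative, not from the paper.

```python
import numpy as np

def dv_lower_bound(t_joint, t_marginal):
    """Donsker-Varadhan bound from critic values on paired (joint) samples
    and on shuffled samples that emulate the product of marginals."""
    return t_joint.mean() - np.log(np.exp(t_marginal).mean())

rho, n = 0.8, 200_000
rng = np.random.default_rng(0)
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)

def critic(x, y, rho=rho):
    # Optimal critic for a standard bivariate Gaussian with correlation rho:
    # T*(x, y) = log p(x, y) - log p(x)p(y)
    return (-0.5 * np.log(1.0 - rho**2)
            - (rho**2 * x**2 - 2.0 * rho * x * y + rho**2 * y**2)
              / (2.0 * (1.0 - rho**2)))

y_shuffled = rng.permutation(y)          # breaks the x-y pairing
estimate = dv_lower_bound(critic(x, y), critic(x, y_shuffled))
true_mi = -0.5 * np.log(1.0 - rho**2)    # about 0.511 nats
print(f"DV estimate: {estimate:.3f}, true MI: {true_mi:.3f}")
```

In MINE, T is a neural network trained to maximize this bound, which is where the O(d log d) sample requirement and the overfitting risk discussed above enter; DEMINE's contribution is making that estimation reliable at small sample sizes via meta-learning.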


