Estimating mutual information in high dimensions via classification error

06/16/2016
by Charles Y. Zheng, et al.

Multivariate pattern analysis approaches in neuroimaging are fundamentally concerned with investigating the quantity and type of information processed by various regions of the human brain; typically, estimates of classification accuracy are used to quantify information. While an extensive and powerful library of methods can be applied to train and assess classifiers, it is not always clear how to use the resulting measures of classification performance to draw scientific conclusions, e.g., for evaluating redundancy between brain regions. An additional confound in interpreting classification performance is the dependence of the error rate on the number and choice of distinct classes used in the classification task. In contrast, mutual information is a quantity defined independently of the experimental design and has ideal properties for comparative analyses. Unfortunately, estimating mutual information from observations becomes statistically infeasible in high dimensions without some kind of assumption or prior. In this paper, we construct a novel classification-based estimator of mutual information based on high-dimensional asymptotics. We show that in a particular limiting regime, the mutual information is an invertible function of the expected k-class Bayes error. While the theory is based on a large-sample, high-dimensional limit, we demonstrate through simulations that our proposed estimator performs better than the alternatives in problems of moderate dimensionality.
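
The estimator described in the abstract suggests a two-stage pipeline: obtain a cross-validated estimate of k-class classification accuracy, then numerically invert a monotone map from mutual information to expected Bayes accuracy. The Python sketch below illustrates that shape of computation only; the specific mapping `expected_bayes_accuracy` (a Gaussian-limit form, E_Z[Phi(Z + sqrt(2I))^(k-1) ] with I in nats) and the helper names are illustrative assumptions, not the paper's exact expressions.

```python
# Hedged sketch of a classification-based MI estimate: cross-validated
# k-class accuracy, mapped back to mutual information by inverting an
# assumed monotone accuracy(MI, k) curve. The particular curve used here
# (a Gaussian-limit form) is an illustrative assumption, not necessarily
# the exact expression derived in the paper.
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def expected_bayes_accuracy(mi_nats, k, n_mc=100_000, seed=0):
    """Assumed map from MI (in nats) to expected k-class Bayes accuracy.

    Illustrative Gaussian-limit form E_Z[ Phi(Z + sqrt(2*MI))^(k-1) ],
    evaluated by Monte Carlo with a fixed seed so the map is deterministic
    and monotone in MI. At MI = 0 it reduces to chance level 1/k.
    """
    z = np.random.default_rng(seed).standard_normal(n_mc)
    return float(np.mean(norm.cdf(z + np.sqrt(2.0 * mi_nats)) ** (k - 1)))


def mi_from_accuracy(accuracy, k, mi_max=20.0):
    """Numerically invert the assumed accuracy(MI, k) curve by root finding."""
    lo, hi = expected_bayes_accuracy(0.0, k), expected_bayes_accuracy(mi_max, k)
    accuracy = float(np.clip(accuracy, lo, hi))  # keep the root bracketed
    return brentq(lambda mi: expected_bayes_accuracy(mi, k) - accuracy,
                  0.0, mi_max)


def estimate_mi(X, y, cv=5):
    """Cross-validated accuracy of a plug-in classifier, mapped to MI (nats)."""
    k = len(np.unique(y))
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=cv).mean()
    return mi_from_accuracy(acc, k)
```

In use, `estimate_mi(X, y)` would return a value in nats; dividing by `np.log(2)` converts to bits. Because the cross-validated accuracy of any practical classifier tends to sit below the Bayes accuracy, and the assumed map is increasing in MI, an estimate produced this way is generally conservative; the paper's asymptotic analysis is what justifies treating the inverted accuracy as an estimate of the true mutual information.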

Related research

Improved mutual information measure for classification and community detection (07/29/2019)
The information theoretic quantity known as mutual information finds wid...

fastMI: a fast and consistent copula-based estimator of mutual information (12/20/2022)
As a fundamental concept in information theory, mutual information (MI) ...

MINE: Mutual Information Neural Estimation (01/12/2018)
We argue that the estimation of the mutual information between high dime...

Estimating mutual information for spike trains: a bird song example (09/14/2023)
Zebra finch are a model animal used in the study of audition. They are a...

Information, Privacy and Stability in Adaptive Data Analysis (06/02/2017)
Traditional statistical theory assumes that the analysis to be performed...

A Bayesian alternative to mutual information for the hierarchical clustering of dependent random variables (01/21/2015)
The use of mutual information as a similarity measure in agglomerative h...

Sliced Mutual Information: A Scalable Measure of Statistical Dependence (10/11/2021)
Mutual information (MI) is a fundamental measure of statistical dependen...
