A Bayesian alternative to mutual information for the hierarchical clustering of dependent random variables

01/21/2015
by Guillaume Marrelec, et al.

The use of mutual information as a similarity measure in agglomerative hierarchical clustering (AHC) raises an important issue: a correction must be applied for the dimensionality of the variables. In this work, we formulate the decision of merging dependent multivariate normal variables in an AHC procedure as a Bayesian model comparison. We found that the Bayesian formulation naturally shrinks the empirical covariance matrix towards a matrix set a priori (e.g., the identity), provides an automated stopping rule, and corrects for dimensionality using a term that scales up the measure as a function of the dimensionality of the variables. Moreover, the resulting log Bayes factor is asymptotically proportional to the plug-in estimate of mutual information, with an additive correction for dimensionality in agreement with the Bayesian information criterion. We investigated the behavior of these Bayesian alternatives to mutual information (in exact and asymptotic forms) on simulated and real data. A first encouraging result was obtained on simulations: hierarchical clustering based on the log Bayes factor outperformed off-the-shelf clustering techniques, as well as raw and normalized mutual information, in terms of classification accuracy. On a toy example, the Bayesian approaches led to results similar to those of mutual information clustering techniques, with the advantage of automated thresholding. On real functional magnetic resonance imaging (fMRI) datasets measuring brain activity, the Bayesian approach identified clusters consistent with the established outcome of standard procedures. In this application, normalized mutual information behaved atypically, systematically favoring very large clusters. These initial experiments suggest that the proposed Bayesian alternatives to mutual information are a useful new tool for hierarchical clustering.
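To make the asymptotic criterion concrete, here is a minimal Python sketch, not the authors' implementation. It assumes the BIC-style form suggested by the abstract: a merging score n * I_hat - (p*q/2) * log(n), where I_hat is the plug-in Gaussian estimate of the mutual information between a p-dimensional and a q-dimensional group of variables, and p*q is the number of extra parameters (the cross-covariance block) of the dependent model. It also assumes that the automated stopping rule amounts to halting once no pair of groups has a positive score. The function names log_bayes_factor_bic and bayes_ahc are ours.

import numpy as np

def log_bayes_factor_bic(x, y):
    # BIC-style approximation of the log Bayes factor for merging the
    # variable groups x (n, p) and y (n, q): n * I_hat - (p*q/2) * log(n).
    n, p = x.shape
    _, q = y.shape
    s_x = np.atleast_2d(np.cov(x, rowvar=False))
    s_y = np.atleast_2d(np.cov(y, rowvar=False))
    s_xy = np.cov(np.hstack([x, y]), rowvar=False)
    # Plug-in Gaussian mutual information estimate:
    # I_hat = 0.5 * log(det(S_x) * det(S_y) / det(S_xy)).
    i_hat = 0.5 * (np.linalg.slogdet(s_x)[1]
                   + np.linalg.slogdet(s_y)[1]
                   - np.linalg.slogdet(s_xy)[1])
    # Dimensionality penalty: p*q extra parameters (the cross-covariance
    # block), each penalized by 0.5 * log(n), as in the BIC.
    return n * i_hat - 0.5 * p * q * np.log(n)

def bayes_ahc(data):
    # Greedy AHC on the columns of data (n, d): repeatedly merge the pair
    # of groups with the highest score, and stop automatically once every
    # score is non-positive (the independence model is preferred).
    groups = [[j] for j in range(data.shape[1])]
    while len(groups) > 1:
        scores = {(a, b): log_bayes_factor_bic(data[:, groups[a]],
                                               data[:, groups[b]])
                  for a in range(len(groups))
                  for b in range(a + 1, len(groups))}
        (a, b), best = max(scores.items(), key=lambda kv: kv[1])
        if best <= 0:  # automated stopping rule
            break
        groups[a].extend(groups[b])
        del groups[b]
    return groups

On data drawn from two independent blocks of correlated Gaussian variables, bayes_ahc should recover the two blocks and then stop rather than merging them; the sign change of the score replaces the manual threshold that raw mutual information would require.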

Related research

05/04/2020 · Renormalized Mutual Information for Extraction of Continuous Features
We derive a well-defined renormalized version of mutual information that...

02/10/2020 · A Test for Independence Via Bayesian Nonparametric Estimation of Mutual Information
Mutual information is a well-known tool to measure the mutual dependence...

06/23/2023 · Exact mutual information for lognormal random variables
Stochastic correlated observables with lognormal distribution are ubiqui...

08/07/2014 · Robust Feature Selection by Mutual Information Distributions
Mutual information is widely used in artificial intelligence, in a descr...

06/29/2019 · Kolmogorov's Algorithmic Mutual Information Is Equivalent to Bayes' Law
Given two events A and B, Bayes' law is based on the argument that the p...

06/16/2016 · Estimating mutual information in high dimensions via classification error
Multivariate pattern analyses approaches in neuroimaging are fundamental...

09/04/2018 · Pointwise HSIC: A Linear-Time Kernelized Co-occurrence Norm for Sparse Linguistic Expressions
In this paper, we propose a new kernel-based co-occurrence measure that ...
