CCMI : Classifier based Conditional Mutual Information Estimation

06/05/2019
by   Sudipto Mukherjee, et al.

Conditional Mutual Information (CMI) is a measure of the conditional dependence between random variables X and Y, given another random variable Z. It can be used to quantify conditional dependence among variables in many data-driven inference problems such as graphical models, causal learning, feature selection and time-series analysis. While k-nearest neighbor (kNN) based estimators as well as kernel-based methods have been widely used for CMI estimation, they suffer severely from the curse of dimensionality. In this paper, we leverage advances in classifiers and generative models to design methods for CMI estimation. Specifically, we introduce an estimator for KL divergence based on the likelihood ratio, obtained by training a classifier to distinguish the observed joint distribution from the product distribution. We then show how to construct several CMI estimators from this basic divergence estimator by drawing on ideas from conditional generative models. We demonstrate that the estimates from our proposed approaches do not degrade with increasing dimension and significantly improve over the widely used KSG estimator. Finally, as an application of accurate CMI estimation, we use our best estimator for conditional independence testing and outperform the state-of-the-art tester on both simulated and real datasets.
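To make the classifier-based idea in the abstract concrete, below is a minimal sketch of a likelihood-ratio KL-divergence estimate combined with the chain-rule identity I(X;Y|Z) = I(X;Y,Z) - I(X;Z). This is an illustration under stated assumptions, not the paper's implementation: the function names (classifier_kl_estimate, mi_estimate, cmi_estimate), the small scikit-learn MLP classifier, and the permutation trick used to draw product-distribution samples are all choices made here for brevity; the paper itself constructs product/conditional samples via conditional generative models and reports a tuned pipeline.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier


def classifier_kl_estimate(samples_p, samples_q, seed=0):
    """Estimate D_KL(p || q) from samples: train a binary classifier to tell
    p-samples from q-samples and read the likelihood ratio off its output.
    For clarity, no train/test split is done here."""
    X = np.vstack([samples_p, samples_q])
    y = np.concatenate([np.ones(len(samples_p)), np.zeros(len(samples_q))])
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                        random_state=seed).fit(X, y)
    eps = 1e-6
    g_p = np.clip(clf.predict_proba(samples_p)[:, 1], eps, 1 - eps)
    g_q = np.clip(clf.predict_proba(samples_q)[:, 1], eps, 1 - eps)
    r_p = g_p / (1.0 - g_p)   # estimated likelihood ratio p/q on p-samples
    r_q = g_q / (1.0 - g_q)   # estimated likelihood ratio p/q on q-samples
    # Donsker-Varadhan form evaluated with the classifier-derived ratio.
    return np.mean(np.log(r_p)) - np.log(np.mean(r_q))


def mi_estimate(x, w, rng):
    """I(X;W) as D_KL(p(x,w) || p(x)p(w)); product samples are made here by
    permuting w, a simplification of generator-based product sampling.
    x and w are 2-D arrays of shape (n, d_x) and (n, d_w)."""
    joint = np.hstack([x, w])
    prod = np.hstack([x, w[rng.permutation(len(w))]])
    return classifier_kl_estimate(joint, prod)


def cmi_estimate(x, y, z, seed=0):
    """I(X;Y|Z) = I(X; Y,Z) - I(X; Z): the difference-of-MI construction."""
    rng = np.random.default_rng(seed)
    return mi_estimate(x, np.hstack([y, z]), rng) - mi_estimate(x, z, rng)


if __name__ == "__main__":
    # Example: Markov chain X -> Z -> Y, for which I(X;Y|Z) = 0 exactly,
    # so the estimate should come out close to zero.
    rng = np.random.default_rng(1)
    n = 5000
    x = rng.normal(size=(n, 1))
    z = x + 0.5 * rng.normal(size=(n, 1))
    y = z + 0.5 * rng.normal(size=(n, 1))
    print(cmi_estimate(x, y, z))
```

The difference-of-MI route is only one of the constructions the abstract alludes to; estimators that draw samples from an approximation of p(y|z) with a conditional generative model follow the same classifier-based divergence step but replace the permutation used above.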


