Nonparametric independence testing via mutual information

11/17/2017
by Thomas B. Berrett et al.

We propose a test of independence of two multivariate random vectors, given a sample from the underlying population. Our approach, which we call MINT, is based on the estimation of mutual information, whose decomposition into joint and marginal entropies facilitates the use of recently developed, efficient entropy estimators derived from nearest neighbour distances. The proposed critical values, which may be obtained from simulation (in the case where one marginal is known) or resampling, guarantee that the test has nominal size, and we provide local power analyses, uniformly over classes of densities whose mutual information satisfies a lower bound. Our ideas may be extended to provide new goodness-of-fit tests of normal linear models based on assessing the independence of the vector of covariates and an appropriately defined notion of an error vector. The theory is supported by numerical studies on both simulated and real data.
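The recipe in the abstract (estimate I(X; Y) as H(X) + H(Y) - H(X, Y), with each entropy estimated from nearest neighbour distances, then calibrate the critical value by resampling) can be sketched as follows. This is a minimal illustration using the classical Kozachenko-Leonenko entropy estimator and a naive permutation calibration, not the paper's efficient weighted estimators; the function names are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(data, k=3):
    """Kozachenko-Leonenko entropy estimate (in nats) from k-NN distances.

    Assumes continuous data with no duplicate points (so distances are > 0).
    """
    n, d = data.shape
    # Distance from each point to its k-th nearest neighbour, excluding itself.
    rho = cKDTree(data).query(data, k=k + 1)[0][:, k]
    # log volume of the unit d-dimensional Euclidean ball.
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(rho))

def mutual_information(x, y, k=3):
    """I(X; Y) = H(X) + H(Y) - H(X, Y), each term estimated separately."""
    return kl_entropy(x, k) + kl_entropy(y, k) - kl_entropy(np.hstack([x, y]), k)

def mint_permutation_test(x, y, k=3, n_perm=199, rng=None):
    """Permutation p-value for H0: X independent of Y.

    Permuting the rows of y breaks any dependence while preserving both
    marginals, so the permuted statistics approximate the null distribution.
    """
    rng = np.random.default_rng(rng)
    stat = mutual_information(x, y, k)
    null = [mutual_information(x, rng.permutation(y), k) for _ in range(n_perm)]
    return (1 + sum(s >= stat for s in null)) / (n_perm + 1)
```

Because the p-value is computed by comparing the observed statistic with its own permutation distribution, the test has the advertised nominal size by exchangeability, regardless of the bias of the entropy estimator itself.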


Related research

02/10/2020
A Test for Independence Via Bayesian Nonparametric Estimation of Mutual Information
Mutual information is a well-known tool to measure the mutual dependence...

05/05/2021
A nonparametric test of independence based on L_1-error
We propose a test of mutual independence between random vectors with arb...

07/19/2022
A Normal Test for Independence via Generalized Mutual Information
Testing hypothesis of independence between two random elements on a join...

11/02/2017
Geometric k-nearest neighbor estimation of entropy and mutual information
Like most nonparametric estimators of information functionals involving ...

10/27/2021
Data-Driven Representations for Testing Independence: Modeling, Analysis and Connection with Mutual Information Estimation
This work addresses testing the independence of two continuous and finit...

06/22/2023
Inferring the finest pattern of mutual independence from data
For a random variable X, we are interested in the blind extraction of it...

03/06/2018
Exact partial information decompositions for Gaussian systems based on dependency constraints
The Partial Information Decomposition (PID) [arXiv:1004.2515] provides a...
