On mutual information estimation for mixed-pair random variables

12/27/2018
by Aleksandr Beknazaryan, et al.

We study mutual information estimation for mixed-pair random variables, where one random variable is discrete and the other is continuous. We develop a kernel method to estimate the mutual information between the two random variables. The estimates enjoy a central limit theorem under some regularity conditions on the distributions, and the theoretical results are demonstrated by a simulation study.
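The abstract names the approach but not the estimator's exact form. As a rough illustration of a kernel plug-in estimator for a mixed pair (discrete X, continuous Y), one can estimate the conditional density of Y within each discrete level and the marginal density of Y with Gaussian kernel density estimates, then average the log density ratio over the sample. The sketch below, including the function name mixed_pair_mi, is an illustrative assumption, not the authors' construction.

```python
import numpy as np
from scipy.stats import gaussian_kde


def mixed_pair_mi(x, y):
    """Plug-in kernel estimate of I(X; Y) for discrete x and continuous y.

    Uses I(X; Y) = E[log f(Y | X) - log f(Y)], with both densities
    replaced by Gaussian kernel density estimates (an illustrative
    sketch, not the estimator from the paper).
    """
    x = np.asarray(x)
    y = np.asarray(y, dtype=float)
    n = len(y)
    f_y = gaussian_kde(y)  # KDE of the marginal density of Y
    total = 0.0
    for level in np.unique(x):
        y_level = y[x == level]            # samples with X at this level
        f_y_level = gaussian_kde(y_level)  # KDE of f(y | X = level)
        # accumulate log density ratios over the samples in this level
        total += np.sum(np.log(f_y_level(y_level)) - np.log(f_y(y_level)))
    return total / n


# Toy check: Y depends on X, so the estimate should be clearly positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, size=5000)
y = rng.normal(loc=2.0 * x, scale=1.0)
print(mixed_pair_mi(x, y))
```

Note that each discrete level needs several samples with nonzero spread for gaussian_kde to be well defined, and no bias correction is attempted, so this is only a starting point next to the asymptotic theory the paper develops.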


Related research

06/15/2019
Non parametric estimation of Joint entropy and Shannon mutual information, Asymptotic limits: Application to statistic tests
This paper proposes a new method for estimating the joint probability ma...

06/26/2023
A short proof of the Gács–Körner theorem
We present a short proof of a celebrated result of Gács and Körner givin...

06/23/2021
A partial information decomposition for discrete and continuous variables
Conceptually, partial information decomposition (PID) is concerned with ...

05/04/2020
Renormalized Mutual Information for Extraction of Continuous Features
We derive a well-defined renormalized version of mutual information that...

12/06/2019
Conditional Mutual Information Estimation for Mixed Discrete and Continuous Variables with Nearest Neighbors
Fields like public health, public policy, and social science often want ...

06/23/2023
Exact mutual information for lognormal random variables
Stochastic correlated observables with lognormal distribution are ubiqui...

11/20/2022
Diffeomorphic Information Neural Estimation
Mutual Information (MI) and Conditional Mutual Information (CMI) are mul...
