Renormalized Mutual Information for Extraction of Continuous Features

05/04/2020
by Leopoldo Sarra, et al.

We derive a well-defined renormalized version of mutual information that allows one to estimate the dependence between continuous random variables in the important case when one is deterministically dependent on the other. This is the situation relevant for feature extraction and for information processing in artificial neural networks. We illustrate with basic examples how the renormalized mutual information can be used not only to compare the usefulness of different ansatz features, but also to automatically extract optimal features of a system in an unsupervised dimensionality-reduction scenario.
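The problem the abstract points to can be seen numerically: when a continuous variable y depends deterministically on x, the standard mutual information I(X;Y) is infinite, so any naive binned estimator grows without bound as the resolution increases. A minimal NumPy sketch of this divergence (the feature f(x) = tanh(x) is an illustrative choice, not the paper's example; this is the motivating failure mode, not the paper's renormalized estimator):

```python
import numpy as np

def binned_mi(x, y, bins):
    """Naive mutual-information estimate from a 2D histogram (in nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0                          # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=200_000)
y = np.tanh(x)  # deterministic feature y = f(x)

for bins in (8, 32, 128, 512):
    print(bins, binned_mi(x, y, bins))
# The estimate keeps growing (roughly like log(bins)) instead of converging:
# the mutual information of a deterministic continuous pair is infinite,
# which is why a renormalized definition is needed.
```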

Related research

12/27/2018
On mutual information estimation for mixed-pair random variables
We study the mutual information estimation for mixed-pair random variabl...

01/12/2018
MINE: Mutual Information Neural Estimation
We argue that the estimation of the mutual information between high dime... (see the sketch after this list)

08/07/2014
Robust Feature Selection by Mutual Information Distributions
Mutual information is widely used in artificial intelligence, in a descr...

05/14/2020
The Information Mutual Information Ratio for Counting Image Features and Their Matches
Feature extraction and description is an important topic of computer vis...

01/21/2015
A Bayesian alternative to mutual information for the hierarchical clustering of dependent random variables
The use of mutual information as a similarity measure in agglomerative h...

08/05/2021
M2IOSR: Maximal Mutual Information Open Set Recognition
In this work, we aim to address the challenging task of open set recogni...

10/19/2012
Sufficient Dimensionality Reduction with Irrelevant Statistics
The problem of finding a reduced dimensionality representation of catego...
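For context on the MINE entry above: MINE estimates mutual information by maximizing the Donsker-Varadhan lower bound, I(X;Y) >= E_{P(X,Y)}[T] - log E_{P(X)P(Y)}[e^T], over a neural network critic T. A minimal PyTorch sketch of that bound (network size, optimizer, training length, and the permutation trick for marginal samples are illustrative choices, not the paper's exact setup):

```python
import math
import torch
import torch.nn as nn

class StatNet(nn.Module):
    """Critic T(x, y) for the Donsker-Varadhan bound."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=1)).squeeze(-1)

def dv_bound(T, x, y):
    """Donsker-Varadhan lower bound on I(X;Y), in nats."""
    joint = T(x, y).mean()                  # E_{P(x,y)}[T]
    y_perm = y[torch.randperm(y.size(0))]   # approximate samples from P(x)P(y)
    marg = torch.logsumexp(T(x, y_perm), dim=0) - math.log(x.size(0))
    return joint - marg                     # joint term minus log E[e^T]

torch.manual_seed(0)
x = torch.randn(5000, 1)
y = x + 0.5 * torch.randn(5000, 1)  # noisy dependence, so I(X;Y) is finite

T = StatNet()
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    (-dv_bound(T, x, y)).backward()  # maximize the lower bound
    opt.step()

# For this Gaussian pair the true value is 0.5 * ln(1 + 1/0.25) ~ 0.80 nats.
print("MI estimate (nats):", dv_bound(T, x, y).item())
```

Note the contrast with the sketch after the abstract: MINE's bound is well behaved for noisy dependence, but for a strictly deterministic pair the quantity it targets is still infinite, which is the gap the renormalized mutual information is meant to close.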
