Multivariate Extension of Matrix-based Renyi's α-order Entropy Functional

08/23/2018
by Shujian Yu, et al.

The matrix-based Renyi's α-order entropy functional was recently introduced using the normalized eigenspectrum of a Hermitian matrix of the projected data in the reproducing kernel Hilbert space (RKHS). However, the current theory of the matrix-based Renyi's α-order entropy functional only defines the entropy of a single variable or the mutual information between two random variables. In the information theory and machine learning communities, one is also frequently interested in multivariate information quantities, such as the multivariate joint entropy and various interaction quantities among multiple variables. In this paper, we first define the matrix-based Renyi's α-order joint entropy among multiple variables. We then show how this definition eases the estimation of various information quantities that measure interactions among multiple variables, such as interaction information and total correlation. We finally present an application to feature selection to show how our definition provides a simple yet powerful way to estimate a quantity widely acknowledged to be intractable from data. A real example on hyperspectral image (HSI) band selection is also provided.
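To make the construction concrete, the sketch below estimates the matrix-based Renyi's α-order entropy from the eigenspectrum of a trace-normalized Gram matrix, extends it to a multivariate joint entropy via the normalized Hadamard product of Gram matrices, and combines the two into a mutual information estimate. This is a minimal illustration of the quantities the abstract describes, not the authors' reference implementation; the Gaussian kernel, the bandwidth `sigma`, and all function names are assumptions for the example.

```python
import numpy as np

def gram_matrix(X, sigma=1.0):
    # Gaussian (RBF) Gram matrix on the rows of X, normalized so trace(A) = 1.
    # sigma is an assumed kernel bandwidth; in practice it must be tuned.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq_dists / (2.0 * sigma ** 2))
    return K / np.trace(K)

def renyi_entropy(A, alpha=2.0):
    # Matrix-based Renyi alpha-entropy:
    #   S_alpha(A) = (1 / (1 - alpha)) * log2( sum_i lambda_i(A)^alpha ),
    # computed from the eigenspectrum of the trace-normalized Gram matrix A.
    lam = np.linalg.eigvalsh(A)
    lam = lam[lam > 1e-12]          # drop numerically zero eigenvalues
    return np.log2(np.sum(lam ** alpha)) / (1.0 - alpha)

def joint_entropy(grams, alpha=2.0):
    # Multivariate joint entropy: Hadamard (elementwise) product of the
    # per-variable Gram matrices, renormalized to unit trace.
    H = grams[0]
    for A in grams[1:]:
        H = H * A
    H = H / np.trace(H)
    return renyi_entropy(H, alpha)

def mutual_information(A, B, alpha=2.0):
    # I_alpha(X; Y) = S_alpha(A) + S_alpha(B) - S_alpha(A, B)
    return (renyi_entropy(A, alpha) + renyi_entropy(B, alpha)
            - joint_entropy([A, B], alpha))
```

With `n` samples and a Gram matrix equal to `I/n` (all points maximally dissimilar), every eigenvalue is `1/n` and the entropy attains its maximum `log2(n)`, which gives a quick sanity check on the formula.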


