Measures of Entropy from Data Using Infinitely Divisible Kernels

11/11/2012
by Luis G. Sanchez Giraldo, et al.

Information theory provides principled ways to analyze inference and learning problems such as hypothesis testing, clustering, dimensionality reduction, and classification, among others. However, the use of information theoretic quantities as test statistics, that is, as quantities obtained from empirical data, poses a challenging estimation problem that often leads to strong simplifications such as Gaussian models, or to the use of plug-in density estimators that are restricted to certain representations of the data. In this paper, a framework is presented to non-parametrically obtain measures of entropy directly from data using operators in reproducing kernel Hilbert spaces defined by infinitely divisible kernels. The entropy functionals, which bear resemblance to quantum entropies, are defined on positive definite matrices and satisfy axioms similar to those of Renyi's definition of entropy. Convergence of the proposed estimators follows from concentration results on the difference between the ordered spectrum of the Gram matrices and that of the integral operators associated to the population quantities. In this way, capitalizing on both the axiomatic definition of entropy and on the representation power of positive definite kernels, the proposed measure of entropy avoids the estimation of the probability distribution underlying the data. Moreover, estimators of kernel-based conditional entropy and mutual information are also defined. Numerical experiments on independence tests compare favourably with the state of the art.
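The abstract's construction can be sketched concretely. A minimal sketch, assuming a Gaussian (infinitely divisible) kernel and the matrix-based Renyi α-entropy S_α(A) = (1/(1−α)) log₂ tr(Aᵅ) on the trace-normalized Gram matrix, with joint quantities built from Hadamard products of Gram matrices; the function names and the choice of kernel bandwidth here are illustrative, not from the paper:

```python
import numpy as np

def gram(X, sigma=1.0):
    """Gaussian (infinitely divisible) Gram matrix of a sample X (n x d)."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def entropy(K, alpha=2.0):
    """Matrix-based Renyi entropy S_alpha(A) = log2(tr(A^alpha)) / (1 - alpha),
    where A = K / tr(K) is positive definite with unit trace."""
    lam = np.clip(np.linalg.eigvalsh(K / np.trace(K)), 0.0, None)
    lam = lam[lam > 0]  # drop zero eigenvalues before exponentiation
    return np.log2(np.sum(lam**alpha)) / (1.0 - alpha)

def mutual_information(X, Y, alpha=2.0, sigma=1.0):
    """Kernel-based mutual information: S(A) + S(B) - S(A o B), where the
    joint entropy uses the Hadamard (elementwise) product of the two Gram
    matrices, re-normalized to unit trace inside entropy()."""
    Kx, Ky = gram(X, sigma), gram(Y, sigma)
    return entropy(Kx, alpha) + entropy(Ky, alpha) - entropy(Kx * Ky, alpha)
```

As a sanity check on the axioms: for n well-separated points the normalized Gram matrix approaches (1/n)·I and the entropy approaches log₂ n, while a constant sample gives a rank-one matrix and entropy 0; the mutual information between a sample and a constant vanishes.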


