Exponential Concentration of a Density Functional Estimator

03/28/2016
by   Shashank Singh, et al.

We analyze a plug-in estimator for a large class of integral functionals of one or more continuous probability densities. This class includes important families of entropy, divergence, mutual information, and their conditional versions. For densities on the d-dimensional unit cube [0,1]^d that lie in a β-Hölder smoothness class, we prove that our estimator converges at the rate O(n^{-β/(β+d)}). Furthermore, we prove that the estimator is exponentially concentrated about its mean, whereas most previous related results have established only bounds on expected error.
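To illustrate the idea behind a plug-in estimator of a density functional, the following is a minimal sketch (not the paper's actual estimator, which includes boundary corrections and operates in d dimensions): estimate the density with a kernel density estimate, then evaluate the functional — here the differential entropy H(p) = -∫ p log p — on the estimate. All function names, the bandwidth, and the grid size are illustrative choices.

```python
import numpy as np

def gaussian_kde(samples, points, h):
    """Gaussian kernel density estimate at `points` from 1-D `samples`
    with bandwidth `h`."""
    diffs = (points[:, None] - samples[None, :]) / h
    kernel = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)
    return kernel.sum(axis=1) / (len(samples) * h)

def plugin_entropy(samples, h=0.05, grid_size=1000):
    """Plug-in estimate of differential entropy H(p) = -∫ p log p
    for a density supported on [0, 1]: plug the KDE into the functional
    and integrate numerically over a uniform grid."""
    grid = np.linspace(0.0, 1.0, grid_size)
    p_hat = gaussian_kde(samples, grid, h)
    p_hat = np.clip(p_hat, 1e-12, None)  # guard against log(0)
    # Riemann approximation of the integral over [0, 1]
    return -(p_hat * np.log(p_hat)).mean()
```

For the uniform density on [0,1], whose true entropy is 0, the estimate should be near 0 for moderate sample sizes, up to bias at the boundary of the cube (the kind of bias the paper's smoothness and boundary assumptions are designed to control).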


Related research:

- Analysis of KNN Information Estimators for Smooth Distributions (10/27/2018): KSG mutual information estimator, which is based on the distances of eac…
- Nonparametric Estimation of Renyi Divergence and Friends (02/12/2014): We consider nonparametric estimation of L_2, Renyi-α and Tsallis-α diver…
- SMML estimators for 1-dimensional continuous data (12/20/2012): A method is given for calculating the strict minimum message length (SMM…
- Average values of functionals and concentration without measure (01/26/2018): Although there doesn't exist the Lebesgue measure in the ball M of C[0,1…
- Generalized Exponential Concentration Inequality for Rényi Divergence Estimation (03/28/2016): Estimating divergences in a consistent way is of great importance in man…
- Universal Densities Exist for Every Finite Reference Measure (09/24/2022): As it is known, universal codes, which estimate the entropy rate consist…
- Inductive Mutual Information Estimation: A Convex Maximum-Entropy Copula Approach (02/25/2021): We propose a novel estimator of the mutual information between two ordin…
