Generalized Exponential Concentration Inequality for Rényi Divergence Estimation

03/28/2016
by   Shashank Singh, et al.

Estimating divergences in a consistent way is of great importance in many machine learning tasks. Although this is a fundamental problem in nonparametric statistics, to the best of our knowledge no finite-sample exponential concentration bound has previously been derived for any divergence estimator. The main contribution of our work is to provide such a bound for an estimator of the Rényi-α divergence between densities in a smooth Hölder class on the d-dimensional unit cube [0, 1]^d. We also illustrate our theoretical results with a numerical experiment.
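The Rényi-α divergence between densities p and q is D_α(p‖q) = (α−1)⁻¹ log ∫ p^α q^{1−α}. As a rough illustration of the plug-in approach to estimating it (a generic histogram-based sketch, not the paper's specific estimator; the function name, bin count, and one-dimensional support are illustrative assumptions):

```python
import numpy as np

def renyi_divergence_plugin(x, y, alpha=0.5, n_bins=20):
    """Histogram plug-in estimate of the Renyi-alpha divergence
    D_alpha(p || q) = (alpha - 1)^{-1} * log( integral of p^alpha * q^(1-alpha) )
    for samples x ~ p and y ~ q supported on [0, 1].

    This is a simple illustrative estimator, not the corrected
    estimator analyzed in the paper.
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    width = 1.0 / n_bins
    # Histogram density estimates of p and q on a common grid.
    p_hat = np.histogram(x, bins=edges)[0] / (len(x) * width)
    q_hat = np.histogram(y, bins=edges)[0] / (len(y) * width)
    # Restrict to bins where both estimates are positive to avoid
    # division by zero / log of zero.
    mask = (p_hat > 0) & (q_hat > 0)
    integral = np.sum(p_hat[mask] ** alpha * q_hat[mask] ** (1 - alpha)) * width
    return np.log(integral) / (alpha - 1)
```

For identical samples the estimate is exactly zero, since the integrand reduces to the estimated density itself; for genuinely different distributions the estimate is positive but biased, which is why consistency and concentration results for such estimators require care.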


