Uniform Convergence Rate of the Kernel Density Estimator Adaptive to Intrinsic Dimension

10/13/2018
by Jisu Kim, et al.

We derive concentration inequalities for the supremum norm of the difference between a kernel density estimator (KDE) and its pointwise expectation that hold uniformly over the choice of bandwidth and under weaker conditions on the kernel than previously used in the literature. The derived bounds are adaptive to the intrinsic dimension of the underlying distribution. For instance, when the data-generating distribution has a Lebesgue density, our bound yields the same convergence rate as those already known in the literature. However, when the underlying distribution is supported on a lower-dimensional set, our bounds depend explicitly on the intrinsic dimension of the support. Analogous bounds are derived for derivatives of the KDE of any order. Our results are generally applicable but are especially useful for problems in geometric inference and topological data analysis, including level set estimation, density-based clustering, modal clustering and mode hunting, ridge estimation, and persistent homology.
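The quantity studied is the sup-norm deviation of the KDE from its pointwise expectation. Below is a minimal, purely illustrative Python sketch (not the paper's method) that probes this deviation empirically for data supported on the unit circle in R^2, i.e. a set of intrinsic dimension 1; the Gaussian kernel, bandwidth, and Monte Carlo approximation of the expectation are all illustrative choices.

```python
# Illustrative sketch (not from the paper): empirically probing
# sup_x |f_hat_h(x) - E f_hat_h(x)| for a Gaussian KDE when the data
# lie on the unit circle (intrinsic dimension 1) embedded in R^2.
import numpy as np

rng = np.random.default_rng(0)

def sample_circle(n):
    """Draw n points uniformly from the unit circle in R^2."""
    theta = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack([np.cos(theta), np.sin(theta)])

def kde(query, data, h):
    """Gaussian KDE in R^2 evaluated at the query points."""
    d = query.shape[1]
    diff = query[:, None, :] - data[None, :, :]            # (m, n, d)
    sq = np.sum(diff ** 2, axis=-1) / h ** 2               # (m, n)
    kern = np.exp(-0.5 * sq) / ((2.0 * np.pi) ** (d / 2) * h ** d)
    return kern.mean(axis=1)

# Evaluation grid: points on the circle.
grid_theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
grid = np.column_stack([np.cos(grid_theta), np.sin(grid_theta)])

n, h, n_rep = 2000, 0.2, 50
# Approximate E f_hat_h pointwise by averaging over many replications.
vals = np.array([kde(grid, sample_circle(n), h) for _ in range(n_rep)])
expectation = vals.mean(axis=0)

# Supremum-norm deviation for one fresh sample.
dev = np.max(np.abs(kde(grid, sample_circle(n), h) - expectation))
print(f"sup-norm deviation ~ {dev:.4f}")
```

Repeating this for increasing n (or varying h) gives a rough empirical view of how the deviation shrinks; the paper's contribution is a concentration bound for this quantity that holds uniformly in h and scales with the intrinsic dimension rather than the ambient dimension.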

