Minimax Optimal Estimation of KL Divergence for Continuous Distributions

02/26/2020
by Puning Zhao, et al.

Estimating the Kullback-Leibler (KL) divergence between two continuous distributions from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor (kNN) distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound on the minimax mean square error and show that the kNN estimator is asymptotically rate-optimal.
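For context, the kNN approach described in the abstract is commonly instantiated as the classical divergence estimator of Wang, Kulkarni, and Verdú (2009). The sketch below is a minimal illustration of that estimator, not necessarily the exact variant analyzed in this paper; the function name `knn_kl_divergence` and the use of SciPy's `cKDTree` are illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree


def knn_kl_divergence(x, y, k=1):
    """k-NN estimate of D(P || Q) from samples x ~ P and y ~ Q.

    Implements the classical estimator
        D_hat = (d / n) * sum_i log(nu_k(i) / rho_k(i)) + log(m / (n - 1)),
    where rho_k(i) is the distance from x_i to its k-th nearest neighbor
    among the remaining x's, and nu_k(i) is the distance from x_i to its
    k-th nearest neighbor among the y's.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]

    # k-th NN distance within x: query k+1 neighbors because the
    # nearest point to x_i in x is x_i itself (distance 0).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # k-th NN distance from each x_i to the y sample.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1))


# Example: KL divergence between two 1-D Gaussians N(0,1) and N(1,1);
# the true value is 0.5.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(2000, 1))
y = rng.normal(1.0, 1.0, size=(2000, 1))
print(knn_kl_divergence(x, y, k=5))  # should be close to 0.5
```

The estimate improves as the sample sizes grow; the paper's contribution is characterizing exactly how fast the bias and variance of this kind of estimator shrink, and showing no estimator can do asymptotically better in the minimax sense.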
