Minimax Optimal Estimation of KL Divergence for Continuous Distributions

02/26/2020
by Puning Zhao, et al.

Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound of the minimax mean square error and show that the kNN method is asymptotically rate optimal.
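The estimator referenced in the abstract plugs k nearest neighbor distances into a log-ratio formula. Below is a minimal sketch of a kNN-based KL divergence estimator of that kind (the classical form of Wang, Kulkarni and Verdú): for samples x_1, ..., x_n ~ P and y_1, ..., y_m ~ Q in d dimensions, it returns (d/n) * sum_i log(nu_k(i)/rho_k(i)) + log(m/(n-1)), where rho_k(i) and nu_k(i) are the k-th nearest neighbor distances of x_i within the x sample and within the y sample. The function name, the SciPy cKDTree dependency, and the default k are illustrative choices, not taken from the paper.

    import numpy as np
    from scipy.spatial import cKDTree

    def knn_kl_divergence(x, y, k=1):
        """Sketch: k-NN estimate of D(P || Q) from x ~ P (n x d) and y ~ Q (m x d)."""
        x = np.asarray(x, dtype=float)
        y = np.asarray(y, dtype=float)
        n, d = x.shape
        m = y.shape[0]

        # rho_k(i): distance from x_i to its k-th nearest neighbor among the other
        # x samples; query k+1 neighbors because the closest point is x_i itself.
        rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

        # nu_k(i): distance from x_i to its k-th nearest neighbor among the y samples.
        nu = cKDTree(y).query(x, k=k)[0]
        if k > 1:
            nu = nu[:, -1]

        return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

    # Example (assumed setup): D(N(0,1) || N(1,1)) = 0.5, so the estimate
    # should be close to 0.5 for large samples.
    rng = np.random.default_rng(0)
    x = rng.normal(0.0, 1.0, size=(5000, 1))
    y = rng.normal(1.0, 1.0, size=(5000, 1))
    print(knn_kl_divergence(x, y, k=5))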


Related research

11/23/2017 · The Nearest Neighbor Information Estimator is Adaptively Near Minimax Rate-Optimal
We analyze the Kozachenko--Leonenko (KL) nearest neighbor estimator for ...

06/29/2019 · A New Lower Bound for Kullback-Leibler Divergence Based on Hammersley-Chapman-Robbins Bound
In this paper, we derive a useful lower bound for the Kullback-Leibler d...

04/26/2018 · Tight MMSE Bounds for the AGN Channel Under KL Divergence Constraints on the Input Distribution
Tight bounds on the minimum mean square error for the additive Gaussian ...

02/17/2017 · Direct Estimation of Information Divergence Using Nearest Neighbor Ratios
We propose a direct estimation method for Rényi and f-divergence measure...

01/12/2018 · Minimax Optimal Additive Functional Estimation with Discrete Distribution: Slow Divergence Speed Case
This paper addresses an estimation problem of an additive functional of ...

06/14/2021 · Improving Bridge estimators via f-GAN
Bridge sampling is a powerful Monte Carlo method for estimating ratios o...

10/30/2014 · On Estimating L_2^2 Divergence
We give a comprehensive theoretical characterization of a nonparametric ...