Minimax Optimal Estimation of KL Divergence for Continuous Distributions

02/26/2020
by Puning Zhao, et al.

Estimating Kullback-Leibler divergence from independent and identically distributed samples is an important problem in various domains. One simple and effective estimator is based on the k nearest neighbor distances between these samples. In this paper, we analyze the convergence rates of the bias and variance of this estimator. Furthermore, we derive a lower bound on the minimax mean square error and show that the kNN method is asymptotically rate optimal.
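For reference, the standard k-nearest-neighbor divergence estimator the abstract alludes to can be sketched as follows. This is a minimal illustration of the classical estimator D(p||q) ≈ (d/n) Σ_i log(ν_k(i)/ρ_k(i)) + log(m/(n-1)), where ρ_k(i) and ν_k(i) are k-NN distances of X_i within the p-samples and to the q-samples respectively; the function name and interface are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_kl_divergence(x, y, k=1):
    """Sketch of the k-NN estimate of D(p || q) from samples x ~ p and y ~ q.

    x : array of shape (n, d), samples from p
    y : array of shape (m, d), samples from q
    k : number of nearest neighbors
    """
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n, d = x.shape
    m, _ = y.shape

    # rho_k(i): distance from x_i to its k-th nearest neighbor among the other
    # p-samples (query k+1 neighbors because the nearest point is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]

    # nu_k(i): distance from x_i to its k-th nearest neighbor among the q-samples.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]

    return (d / n) * np.sum(np.log(nu / rho)) + np.log(m / (n - 1))
```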


