The Nearest Neighbor Information Estimator is Adaptively Near Minimax Rate-Optimal

11/23/2017
by Jiantao Jiao, et al.

We analyze the Kozachenko–Leonenko (KL) nearest neighbor estimator of differential entropy. We obtain the first uniform upper bound on its performance over Hölder balls on a torus without assuming any condition on how close the density can be to zero. Combined with a new minimax lower bound over the Hölder ball, this shows that the KL estimator achieves the minimax rates up to logarithmic factors, without knowledge of the smoothness parameter s of the Hölder ball, for s ∈ (0, 2] and arbitrary dimension d, making it the first estimator that provably satisfies this property.
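For intuition, here is a minimal sketch of the classical fixed-k Kozachenko–Leonenko estimator (k = 1 is the nearest-neighbor case studied in the paper). Two hedges: the paper's analysis concerns Hölder-smooth densities on a torus, whereas this sketch uses ordinary Euclidean nearest-neighbor distances in R^d, and the function name kl_entropy and the choice of SciPy's cKDTree are illustrative, not the authors' code.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def kl_entropy(x, k=1):
    """Kozachenko-Leonenko k-NN estimate of differential entropy, in nats.

    x : (n, d) array of i.i.d. samples from an unknown density.
    k : which nearest neighbor to use (k = 1 is the classical KL estimator).
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Distance from each sample to its k-th nearest neighbor, excluding itself
    # (column 0 of the query result is the zero distance to the point itself).
    dists, _ = cKDTree(x).query(x, k=k + 1)
    eps = dists[:, k]  # duplicate points give eps = 0 and break the log below
    # Log-volume of the unit Euclidean ball in R^d: pi^(d/2) / Gamma(d/2 + 1).
    log_vd = 0.5 * d * np.log(np.pi) - gammaln(0.5 * d + 1.0)
    # H_hat = psi(n) - psi(k) + log V_d + (d / n) * sum_i log eps_i
    return digamma(n) - digamma(k) + log_vd + d * np.mean(np.log(eps))


if __name__ == "__main__":
    # Sanity check: the standard Gaussian in R^d has entropy (d/2) * log(2*pi*e).
    rng = np.random.default_rng(0)
    d = 2
    samples = rng.standard_normal((5000, d))
    print("KL estimate:", kl_entropy(samples))
    print("true value: ", 0.5 * d * np.log(2 * np.pi * np.e))
```

The sanity check compares the estimate on 5,000 Gaussian samples against the closed-form entropy (d/2) log(2πe); no tuning parameter beyond k is needed, which is the adaptivity the paper makes precise.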


Related research

- 02/26/2020 · Minimax Optimal Estimation of KL Divergence for Continuous Distributions: "Estimating Kullback-Leibler divergence from independent and identically ..."
- 02/05/2022 · One-Nearest-Neighbor Search is All You Need for Minimax Optimal Regression and Classification: "Recently, Qiao, Duan, and Cheng (2019) proposed a distributed nearest-ne..."
- 04/11/2016 · Demystifying Fixed k-Nearest Neighbor Information Estimators: "Estimating mutual information from i.i.d. samples drawn from an unknown ..."
- 10/22/2019 · Minimax Rate Optimal Adaptive Nearest Neighbor Classification and Regression: "The k-nearest neighbor (kNN) method is a simple and popular statistical meth..."
- 06/02/2019 · On Testing for Parameters in Ising Models: "We consider testing for the parameters of ferromagnetic Ising models. Wh..."
- 02/25/2021 · On the consistency of the Kozachenko-Leonenko entropy estimate: "We revisit the problem of the estimation of the differential entropy H(f..."
- 08/03/2023 · Minimax Optimal Q Learning with Nearest Neighbors: "Q-learning is a popular model-free reinforcement learning method. Most o..."
