
Direct Estimation of Information Divergence Using Nearest Neighbor Ratios
We propose a direct estimation method for Rényi and f-divergence measures based on a new graph-theoretical interpretation. Suppose that we are given two sample sets X and Y, with N and M samples respectively, where η := M/N is a constant. Considering the k-nearest neighbor (k-NN) graph of Y in the joint data set (X, Y), we show that the average powered ratio of the number of X points to the number of Y points among all k-NN points is proportional to the Rényi divergence of the X and Y densities. A similar method can also be used to estimate f-divergence measures. We derive bias and variance rates and show that, for the class of γ-Hölder smooth functions, the estimator achieves an MSE rate of O(N^{-2γ/(γ+d)}). Furthermore, using a weighted ensemble estimation technique, for density functions with continuous and bounded derivatives up to order d, and some extra conditions at the boundary of the support set, we derive an ensemble estimator that achieves the parametric MSE rate of O(1/N). Our estimators are more computationally tractable than other competing estimators, which makes them appealing in many practical applications.
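The core statistic described above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: for each Y point we find its k nearest neighbors in the pooled set (X, Y), count how many are X points (N_i) versus Y points (M_i), and plug the powered ratio η·N_i/M_i into a Rényi-type formula. The `(m_i + 1)` guard against division by zero, the brute-force distance computation, and the absence of the paper's bias and boundary corrections are all simplifications of my own.

```python
import numpy as np

def knn_ratio_stats(X, Y, k=5):
    """For each point of Y, count X-neighbors (n_i) and Y-neighbors (m_i)
    among its k nearest neighbors in the pooled set (X, Y).
    Brute-force O(M * (N + M)) distance computation, for illustration only."""
    Z = np.vstack([X, Y])
    is_x = np.zeros(len(Z), dtype=bool)
    is_x[:len(X)] = True
    # Distances from each Y point to every pooled point.
    d = np.linalg.norm(Y[:, None, :] - Z[None, :, :], axis=-1)
    # Exclude each query point's own copy in the pooled set.
    d[np.arange(len(Y)), len(X) + np.arange(len(Y))] = np.inf
    nbrs = np.argsort(d, axis=1)[:, :k]   # indices of the k nearest neighbors
    n_i = is_x[nbrs].sum(axis=1)          # X points among the k-NN
    m_i = k - n_i                         # Y points among the k-NN
    return n_i, m_i

def renyi_divergence_estimate(X, Y, alpha=0.5, k=5):
    """Plug the powered neighbor ratios into a Rényi-type functional.
    A sketch of the idea only; constants and corrections differ from the paper."""
    n_i, m_i = knn_ratio_stats(X, Y, k=k)
    eta = len(Y) / len(X)
    ratios = (eta * n_i / (m_i + 1.0)) ** alpha  # +1 avoids division by zero
    return np.log(np.mean(ratios)) / (alpha - 1.0)
```

When X and Y are drawn from the same density the neighbor ratios concentrate around η, so the estimate should sit near zero; diverging densities push the ratios (and hence the estimate) away from it. In practice one would replace the brute-force search with a k-d tree.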