Nonparametric estimation of a multivariate density under Kullback-Leibler loss with ISDE

05/06/2022
by   Louis Pujol, et al.

In this paper, we propose a theoretical analysis of the ISDE algorithm, introduced in previous work. From a dataset, ISDE learns a density written as a product of marginal density estimators over a partition of the features. We show that, under some hypotheses, the Kullback-Leibler loss between the true density and the output of ISDE decomposes into a bias term plus two terms that vanish as the number of samples goes to infinity. The rate of convergence indicates that ISDE tackles the curse of dimensionality by reducing the effective dimension from that of the ambient space to that of the largest block in the partition. The constants reflect a combinatorial complexity reduction linked to the design of ISDE.
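To make the product-form estimator concrete, here is a minimal sketch of the idea for a fixed partition of the features. This is not the ISDE implementation (ISDE also searches over candidate partitions and uses its own estimators and selection criterion); it simply illustrates, with kernel density estimators as the block-wise marginal estimators, how a density over many features is written as a product of lower-dimensional estimators, and why a partition that groups dependent features can achieve a higher held-out log-likelihood than the fully independent one. The data, partition choices, and function names below are illustrative assumptions.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Toy data with 4 features, where features (0, 1) are strongly dependent
# and features (2, 3) are strongly dependent, so a natural partition
# of the features is [[0, 1], [2, 3]].
n = 500
a = rng.normal(size=n)
b = a + 0.1 * rng.normal(size=n)
c = rng.normal(size=n)
d = c + 0.1 * rng.normal(size=n)
X = np.column_stack([a, b, c, d])

train, test = X[:400], X[400:]


def product_log_density(train, test, partition):
    """Log-density of the product estimator: one KDE fitted per block
    of features, with the full density being the product over blocks
    (a sum in log space)."""
    logp = np.zeros(len(test))
    for block in partition:
        # gaussian_kde expects data with shape (dim, n_samples)
        kde = gaussian_kde(train[:, block].T)
        logp += kde.logpdf(test[:, block].T)
    return logp


# Held-out average log-likelihood for two candidate partitions:
# grouping dependent features vs. treating all features as independent.
ll_blocks = product_log_density(train, test, [[0, 1], [2, 3]]).mean()
ll_indep = product_log_density(train, test, [[0], [1], [2], [3]]).mean()
print(f"partition [[0,1],[2,3]]:     {ll_blocks:.3f}")
print(f"partition [[0],[1],[2],[3]]: {ll_indep:.3f}")
```

Because each KDE is fitted only on the features of its block, the estimation problem in each block is lower-dimensional than the ambient space, which is the dimension-reduction effect the convergence rate in the paper captures: the rate is driven by the size of the largest block rather than the full dimension.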


