Monte Carlo and Quasi-Monte Carlo Density Estimation via Conditioning

06/11/2019 · by Pierre L'Ecuyer, et al.

Estimating the unknown density from which a given independent sample originates is more difficult than estimating the mean, in the sense that for the best popular density estimators, the mean integrated square error converges more slowly than the canonical O(1/n) rate. When the sample is generated from a simulation model and we have control over how this is done, we can do better. We study an approach in which conditional Monte Carlo yields a smooth estimator of the cumulative distribution function, whose sample derivative is an unbiased estimator of the density at any point, and therefore converges at a faster rate than the usual density estimators, under mild conditions. By combining this with randomized quasi-Monte Carlo to generate the sample, we can achieve an even faster rate.
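The idea can be illustrated on a hypothetical toy model (not taken from the paper): to estimate the density of X = Y + Z with Y ~ Exponential(1) and Z ~ Normal(0, 1), condition on Z and "hide" Y. The conditional CDF F(x | Z) = P(Y ≤ x − Z) is smooth in x, so its derivative with respect to x, averaged over the simulated values of Z, is an unbiased density estimator. A minimal sketch, with all names and parameters chosen for illustration:

```python
import math
import numpy as np

rng = np.random.default_rng(42)

# Toy model: X = Y + Z, Y ~ Exponential(1) hidden by conditioning, Z ~ N(0, 1).
# Conditional CDF: F(x | Z) = 1 - exp(-(x - Z)) for x > Z, smooth in x.
# Its derivative in x is f_Y(x - Z) = exp(-(x - Z)) * 1{x > Z}; its sample
# mean is an unbiased estimator of the density f_X(x).
n = 100_000
z = rng.standard_normal(n)

def cde(x, z):
    """Conditional density estimator: mean over the sample of d/dx F(x | Z_i)."""
    t = x - z
    return float(np.mean(np.where(t > 0.0, np.exp(-t), 0.0)))

def true_density(x):
    """Closed-form density of Exp(1) + N(0,1): exp(1/2 - x) * Phi(x - 1)."""
    phi = 0.5 * (1.0 + math.erf((x - 1.0) / math.sqrt(2.0)))
    return math.exp(0.5 - x) * phi

for x in (0.5, 1.0, 2.0):
    print(f"x={x}: estimate={cde(x, z):.4f}  exact={true_density(x):.4f}")
```

Because each term f_Y(x − Z_i) is a smooth function of the underlying uniforms, replacing the i.i.d. draws of Z by a randomized quasi-Monte Carlo point set (e.g., a scrambled Sobol' sequence mapped through the normal inverse CDF) is what allows the further rate improvement the abstract mentions.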





