Density estimation by Randomized Quasi-Monte Carlo

07/16/2018, by Amal Ben Abdellah et al.

We consider the problem of estimating the density of a random variable X that can be sampled exactly by Monte Carlo (MC) simulation. We investigate the effectiveness of replacing MC by randomized quasi-Monte Carlo (RQMC) to reduce the integrated variance (IV) and the mean integrated square error (MISE) for histograms and kernel density estimators (KDEs). We show both theoretically and empirically that RQMC estimators can achieve large IV and MISE reductions, and even faster convergence rates than MC in some situations, while leaving the bias unchanged. Typically, RQMC provides a larger IV (and MISE) reduction with KDEs than with histograms. We also find that if RQMC is much more effective than MC for estimating the mean of X in a given application, it does not follow that it is much better than MC for estimating the density of X in the same application. Density estimation involves a well-known bias-variance tradeoff in the choice of a bandwidth parameter h. RQMC improves the convergence at any h, although the gains diminish when h is reduced to control bias.
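The comparison the abstract describes can be sketched in a few lines. The following is an illustrative example, not the authors' code: it takes X to be standard normal (sampled by inversion, so the same transformation applies to MC and RQMC points), uses SciPy's scrambled Sobol' sequence as the RQMC method, and estimates the empirical IV of a Gaussian KDE at a fixed bandwidth h across independent replications. The grid, bandwidth, and sample sizes are arbitrary choices for the sketch.

```python
import numpy as np
from scipy.stats import norm, qmc

def kde(sample, x_eval, h):
    # Gaussian kernel density estimator evaluated on the grid x_eval
    z = (x_eval[:, None] - sample[None, :]) / h
    return norm.pdf(z).mean(axis=1) / h

def integrated_variance(sampler, n, m, x_eval, h):
    # Empirical IV: pointwise variance of the KDE across m independent
    # replications, integrated over the grid by a simple Riemann sum
    estimates = np.array([kde(sampler(n), x_eval, h) for _ in range(m)])
    dx = x_eval[1] - x_eval[0]
    return estimates.var(axis=0, ddof=1).sum() * dx

rng = np.random.default_rng(42)

def mc_sample(n):
    # Plain MC: X ~ N(0, 1) by inversion of i.i.d. uniforms
    return norm.ppf(rng.uniform(size=n))

def rqmc_sample(n):
    # RQMC: scrambled Sobol' points pushed through the same inversion;
    # drawing a fresh scramble per replication is the randomization
    u = qmc.Sobol(d=1, scramble=True, seed=rng).random(n).ravel()
    return norm.ppf(u)

x_eval = np.linspace(-3.0, 3.0, 121)
h = 0.25            # fixed bandwidth: the bias is identical for MC and RQMC
n, m = 256, 50      # points per replication, number of replications
iv_mc = integrated_variance(mc_sample, n, m, x_eval, h)
iv_rqmc = integrated_variance(rqmc_sample, n, m, x_eval, h)
print(f"IV (MC):   {iv_mc:.2e}")
print(f"IV (RQMC): {iv_rqmc:.2e}")
```

With the bandwidth held fixed, any difference between the two IV figures is purely a variance effect, which is exactly the quantity the paper's IV-reduction claims concern; shrinking h trades that variance against bias for both samplers alike.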
