Block bootstrap optimality for density estimation with dependent data
Accurate approximation of the sampling distribution of nonparametric kernel density estimators is crucial for many statistical inference problems. Since these estimators have complex asymptotic distributions, bootstrap methods are often used for this purpose. With i.i.d. observations, a large literature exists on optimal bootstrap methods that achieve the fastest possible convergence rate of the bootstrap estimator of the sampling distribution of the kernel density estimator. With dependent data, such an optimality theory remains an important open problem. We establish a general theory of optimality of the block bootstrap for kernel density estimation under weak dependence assumptions that are satisfied by many important time series models. We propose a unified framework for a theoretical study of a rich class of bootstrap methods, which includes as special cases subsampling, Künsch's moving block bootstrap, Hall's under-smoothing (UNS), as well as approaches incorporating no bias correction (NBC) or explicit bias correction (EBC). Moreover, we consider their accuracy under a broad spectrum of choices of the bandwidth h, including the MSE-optimal choice as an important special case, as well as other under-smoothed choices. For each choice of h, we derive the optimal tuning parameters and compare the optimal performance of the main subclasses (EBC, NBC, UNS) of bootstrap methods.
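As an illustrative aside (not taken from the paper), the following minimal Python sketch shows the moving block bootstrap applied to a Gaussian-kernel density estimate at a single point. The bandwidth h, block length, and number of replicates used here are placeholder values for demonstration only, not the optimal tuning parameters derived in the paper.

import numpy as np

def kde_at(x0, data, h):
    """Gaussian-kernel density estimate at x0 with bandwidth h."""
    u = (x0 - data) / h
    return np.mean(np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)) / h

def moving_block_bootstrap_kde(data, x0, h, block_len, n_boot=500, rng=None):
    """Bootstrap replicates of the KDE at x0 using moving blocks of length block_len."""
    rng = np.random.default_rng(rng)
    n = len(data)
    n_blocks = int(np.ceil(n / block_len))
    n_starts = n - block_len + 1  # admissible block start points
    reps = np.empty(n_boot)
    for b in range(n_boot):
        # Resample overlapping blocks, concatenate, and truncate to length n.
        starts = rng.integers(0, n_starts, size=n_blocks)
        sample = np.concatenate([data[s:s + block_len] for s in starts])[:n]
        reps[b] = kde_at(x0, sample, h)
    return reps

# Example: AR(1) series; bootstrap distribution of the KDE at x0 = 0
# (bandwidth and block length chosen ad hoc, not MSE-optimally).
rng = np.random.default_rng(0)
x = np.zeros(400)
for t in range(1, 400):
    x[t] = 0.5 * x[t - 1] + rng.normal()
reps = moving_block_bootstrap_kde(x, x0=0.0, h=0.3, block_len=10)
print(reps.mean(), reps.std())

The empirical spread of the replicates approximates the sampling distribution of the kernel density estimator at x0; how well it does so depends on the interplay of h, the block length, and any bias correction, which is the subject of the paper's optimality analysis.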