On optimal block resampling for Gaussian-subordinated long-range dependent processes
Block-based resampling estimators have been intensively investigated for weakly dependent time processes, which has helped to inform implementation (e.g., best block sizes). However, little is known about resampling performance and block sizes under strong or long-range dependence. To establish guideposts in block selection, we consider a broad class of strongly dependent time processes, formed by a transformation of a stationary long-memory Gaussian series, and examine block-based resampling estimators for the variance of the prototypical sample mean; extensions to general statistical functionals are also considered. Unlike the weakly dependent case, the properties of resampling estimators under strong dependence are shown to depend intricately on the nature of non-linearity in the time series (beyond Hermite ranks), in addition to the long-memory coefficient and block size. Additionally, the intuition has often been that optimal block sizes should be larger under strong dependence (say O(n^1/2) for a sample size n) than the optimal order O(n^1/3) known under weak dependence. This intuition turns out to be largely incorrect, though a block order O(n^1/2) may be reasonable (and even optimal) in many cases, owing to non-linearity in a long-memory time series. While optimal block sizes are more complex under long-range dependence than under short-range dependence, we provide a consistent data-driven rule for block selection, and numerical studies illustrate that the guides for block selection perform well in other block-based problems with long-memory time series, such as distribution estimation and strategies for testing Hermite rank.
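To make the object of study concrete, the following is a minimal sketch of a generic non-overlapping block estimator for the variance of the scaled sample mean; it is not the paper's specific estimator or block-selection rule, and the function name `block_variance_estimator` and the choice of non-overlapping blocks are illustrative assumptions only. The tuning parameter `block_len` plays the role of the block size b whose optimal order (e.g., O(n^1/3) vs. O(n^1/2)) the abstract discusses.

```python
import numpy as np

def block_variance_estimator(x, block_len):
    """Illustrative non-overlapping block estimator of n * Var(sample mean).

    This is a hedged sketch of the general block-based approach, not the
    estimator analyzed in the paper: the series is cut into consecutive
    blocks of length `block_len`, and the variability of block means is
    scaled up to estimate the variance of the overall (scaled) sample mean.
    """
    n = len(x)
    num_blocks = n // block_len
    m = num_blocks * block_len  # discard an incomplete trailing block
    xbar = x[:m].mean()
    block_means = x[:m].reshape(num_blocks, block_len).mean(axis=1)
    # Scaled spread of block means around the overall mean estimates
    # Var(sqrt(n) * sample mean) under suitable dependence conditions.
    return block_len * np.mean((block_means - xbar) ** 2)

# Usage sketch: for i.i.d. N(0, 1) data, n * Var(sample mean) = 1, so the
# estimate should be near 1 for a reasonable block length.
rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
print(block_variance_estimator(x, 100))
```

Under long-range dependence the appropriate normalization and the estimator's bias/variance trade-off change, which is exactly why the block-size orders discussed in the abstract differ from the weakly dependent O(n^1/3) benchmark.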