
A Cheap Bootstrap Method for Fast Inference

by Henry Lam, et al.

The bootstrap is a versatile inference method that has proven powerful in many statistical problems. However, when applied to modern large-scale models, it can face substantial computational demands from repeated data resampling and model fitting. We present a bootstrap methodology that uses minimal computation, with resampling effort as low as one Monte Carlo replication, while maintaining desirable statistical guarantees. We present the theory of this method, which takes a twisted perspective on the standard bootstrap principle. We also present generalizations of this method to nested sampling problems and to a range of subsampling variants, and illustrate how it can be used for fast inference across different estimation problems.
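To make the idea concrete, here is a minimal sketch of a cheap-bootstrap-style confidence interval: with only B resamples (even B = 1), the interval is formed from the estimator and its resampled replicates using a Student-t quantile with B degrees of freedom. The function name and signature are illustrative, not from the paper.

```python
import numpy as np
from scipy import stats

def cheap_bootstrap_ci(data, estimator, B=1, alpha=0.05, rng=None):
    """Sketch of a cheap-bootstrap confidence interval using B resamples.

    Uses a t-quantile with B degrees of freedom, so even B = 1 yields
    a valid-width interval. Names and defaults here are illustrative.
    """
    rng = np.random.default_rng(rng)
    data = np.asarray(data)
    n = len(data)
    psi_hat = estimator(data)  # point estimate on the full sample
    # B bootstrap replicates, each from a resample of size n with replacement
    reps = np.array(
        [estimator(data[rng.integers(0, n, size=n)]) for _ in range(B)]
    )
    # scale from squared deviations of replicates around the point estimate
    s = np.sqrt(np.mean((reps - psi_hat) ** 2))
    q = stats.t.ppf(1 - alpha / 2, df=B)  # t-quantile with B dof
    return psi_hat - q * s, psi_hat + q * s
```

For example, `cheap_bootstrap_ci(sample, np.mean, B=1)` returns a two-sided 95% interval for the mean after fitting the estimator only twice (once on the data, once on a single resample), which is the source of the computational savings.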




Related research:

- Finite-Sample Coverage Errors of the Cheap Bootstrap With Minimal Resampling Effort
- Estimation and Inference by Stochastic Optimization: Three Examples
- Simultaneous Inference for Massive Data: Distributed Bootstrap
- Robust, scalable and fast bootstrap method for analyzing large scale data
- A Scalable Bootstrap for Massive Data
- Inference by Stochastic Optimization: A Free-Lunch Bootstrap
- Optimal Subsampling Bootstrap for Massive Data