Inference by Stochastic Optimization: A Free-Lunch Bootstrap

by Jean-Jacques Forneron, et al.

Assessing sampling uncertainty in extremum estimation can be challenging when the asymptotic variance is not analytically tractable. Bootstrap inference offers a feasible solution but can be computationally costly, especially when the model is complex. This paper uses iterates of a specially designed stochastic optimization algorithm as draws from which both point estimates and bootstrap standard errors can be computed in a single run. The draws are generated from the gradient and Hessian computed on batches of data that are resampled at each iteration. We show that these draws yield consistent estimates and asymptotically valid frequentist inference for a large class of regular problems. The algorithm provides accurate standard errors in simulation examples and empirical applications at low computational cost. The draws from the algorithm also provide a convenient way to detect data irregularities.
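The idea can be illustrated with a minimal sketch, not the paper's actual algorithm: Newton-type updates are computed on batches resampled with replacement, and the iterates are then used both for the point estimate (their mean) and for a rescaled bootstrap standard error (their standard deviation). The batch size, burn-in, and the √(m/n) rescaling below are assumptions made for this toy linear-regression example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: y = x * theta + noise, with true theta = 2.0 (assumption for illustration)
n = 2000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)

def grad_hess(theta, idx):
    """Gradient and Hessian of the least-squares objective on a resampled batch."""
    xb, yb = x[idx], y[idx]
    resid = xb * theta - yb
    g = np.mean(xb * resid)   # gradient of 0.5 * mean((x*theta - y)^2)
    h = np.mean(xb * xb)      # Hessian (a scalar in this one-parameter example)
    return g, h

# Newton-type iterations on batches resampled with replacement each step
m = 200          # batch size (hypothetical tuning choice)
burn_in = 200
n_iter = 1200
theta = 0.0
draws = []
for t in range(n_iter):
    idx = rng.integers(0, n, size=m)   # bootstrap resample of a batch
    g, h = grad_hess(theta, idx)
    theta = theta - g / h              # Newton step using the batch gradient/Hessian
    if t >= burn_in:
        draws.append(theta)

draws = np.array(draws)
theta_hat = draws.mean()                    # point estimate from the iterates
se = draws.std(ddof=1) * np.sqrt(m / n)     # rescaled spread of iterates as a bootstrap SE
print(theta_hat, se)
```

In this single run, `theta_hat` is close to the truth and `se` is close to the usual OLS standard error, without a second pass of bootstrap replications; the √(m/n) rescaling accounts for batches being smaller than the full sample.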
