Bias, Consistency, and Alternative Perspectives of the Infinitesimal Jackknife

06/10/2021
by Wei Peng, et al.

Though introduced nearly 50 years ago, the infinitesimal jackknife (IJ) remains a popular modern tool for quantifying predictive uncertainty in complex estimation settings. In particular, when supervised learning ensembles are constructed via bootstrap samples, recent work has demonstrated that the IJ estimate of variance is particularly convenient and useful. However, despite the algebraic simplicity of its final form, its derivation is rather complex. As a result, studies clarifying the intuition behind the estimator or rigorously investigating its properties have been severely lacking. This work aims to take a step forward on both fronts. We demonstrate that, surprisingly, the exact form of the IJ estimator can be obtained either via a straightforward linear regression of the individual bootstrap estimates on their respective weights or via the classical jackknife. The latter realization is particularly useful, as it allows us to formally investigate the bias of the IJ variance estimator and better characterize the settings in which its use is appropriate. Finally, we extend these results to the case of U-statistics, where base models are constructed via subsampling rather than bootstrapping, and provide a consistent estimate of the resulting variance.
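To make the estimator concrete, the sketch below computes the standard IJ variance estimate for a bagged statistic: for each observation, take the covariance (across bootstrap replicates) between its resampling count and the per-replicate estimate, then sum the squared covariances. This is a minimal illustration using the sample mean as the bagged estimator, for which the IJ estimate should roughly recover the usual variance of the mean; the specific simulation setup (sample size, number of bootstraps) is arbitrary, and the regression formulation discussed in the abstract is noted only in comments, not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, B = 50, 2000                      # sample size and number of bootstrap replicates
X = rng.normal(size=n)               # toy data

# N[b, i] = number of times observation i appears in bootstrap sample b
N = rng.multinomial(n, np.ones(n) / n, size=B)

# Per-replicate estimate t_b: here, the mean of bootstrap sample b
t = (N * X).sum(axis=1) / n

# IJ variance estimate: sum over i of cov(N_i, t)^2, with covariances
# taken empirically across the B bootstrap replicates.
cov = ((N - N.mean(axis=0)) * (t - t.mean())[:, None]).mean(axis=0)
V_ij = (cov ** 2).sum()

# The paper's observation: these per-observation covariances are (up to the
# weight covariance structure) the slopes from linearly regressing the
# bootstrap estimates t_b on the resampling weights N_b.
print(V_ij)
```

For the sample mean, the infinite-bootstrap limit of this estimate is (n-1)s²/n², essentially the familiar s²/n, so `V_ij` lands near `np.var(X, ddof=1) / n` up to Monte Carlo noise (and the finite-B upward bias the paper analyzes).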


