Sparse Approximate Cross-Validation for High-Dimensional GLMs

05/31/2019
by   William Stephenson, et al.

Leave-one-out cross-validation (LOOCV) can be particularly accurate among CV variants for estimating out-of-sample error. Unfortunately, LOOCV requires re-fitting a model N times for a dataset of size N. To avoid this prohibitive computational expense, a number of authors have proposed approximations to LOOCV. These approximations work well when the unknown parameter is of small, fixed dimension but suffer in high dimensions; they incur a running time roughly cubic in the dimension, and, in fact, we show their accuracy significantly deteriorates in high dimensions. We demonstrate that these difficulties can be surmounted in ℓ_1-regularized generalized linear models when we assume that the unknown parameter, while high dimensional, has a small support. In particular, we show that, under interpretable conditions, the support of the recovered parameter does not change as each datapoint is left out. This result implies that the previously proposed heuristic of only approximating CV along the support of the recovered parameter has running time and error that scale with the (small) support size even when the full dimension is large. Experiments on synthetic and real data support the accuracy of our approximations.
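The support-restricted heuristic the abstract refers to can be illustrated for ℓ_1-regularized logistic regression. The sketch below is an assumption-laden illustration (not the paper's code): it fits a lasso logistic model, restricts attention to the estimated support, and uses the standard one-Newton-step approximate-LOO formula for the smooth part of the loss, so the linear-algebra cost scales with the support size rather than the full dimension. The variable names and the λ → C conversion for scikit-learn are choices made here for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
N, D, s = 200, 500, 5                      # N samples, D features, true support size s
X = rng.standard_normal((N, D))
beta_true = np.zeros(D)
beta_true[:s] = 2.0                        # high-dimensional but sparse truth
p = 1.0 / (1.0 + np.exp(-X @ beta_true))
y = (rng.random(N) < p).astype(float)

# Fit the l1-regularized GLM (lasso logistic regression).
lam = 0.1
clf = LogisticRegression(penalty="l1", C=1.0 / (N * lam),
                         solver="liblinear", fit_intercept=False).fit(X, y)
beta = clf.coef_.ravel()
S = np.flatnonzero(np.abs(beta) > 1e-8)    # estimated support

# Derivatives of the smooth logistic loss at the full-data fit.
z = X @ beta
mu = 1.0 / (1.0 + np.exp(-z))
g = mu - y                                 # per-point gradient w.r.t. the linear predictor
w = mu * (1.0 - mu)                        # per-point second derivative (Hessian weight)

# Hessian of the smooth part, restricted to the support: |S| x |S|, not D x D.
XS = X[:, S]
H = XS.T @ (w[:, None] * XS)

# One-Newton-step approximate LOO: all N held-out linear predictors at once,
# using leverage scores of the support-restricted weighted design.
Hinv_XT = np.linalg.solve(H, XS.T)                 # |S| x N
h = np.einsum("ij,ji->i", XS, Hinv_XT)             # h_i = x_{i,S}^T H^{-1} x_{i,S}
z_loo = z + h * g / (1.0 - h * w)                  # approximate x_i^T beta_{-i}
```

Because every solve involves the |S| × |S| matrix `H` rather than a D × D one, the per-point cost scales with the (small) support size even when D is large, which is the computational point the abstract makes.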


research
03/03/2020

Error bounds in estimating the out-of-sample prediction error using leave-one-out cross validation in high-dimensions

We study the problem of out-of-sample risk estimation in the high dimens...
research
05/17/2019

LR-GLM: High-Dimensional Bayesian Inference Using Low-Rank Data Approximations

Due to the ease of modern data collection, applied statisticians often h...
research
11/09/2022

On High-Dimensional Gaussian Comparisons For Cross-Validation

We derive high-dimensional Gaussian comparison results for the standard ...
research
08/20/2018

On the error in Laplace approximations of high-dimensional integrals

Laplace approximations are commonly used to approximate high-dimensional...
research
09/19/2022

Robust leave-one-out cross-validation for high-dimensional Bayesian models

Leave-one-out cross-validation (LOO-CV) is a popular method for estimati...
research
08/24/2020

Approximate Cross-Validation with Low-Rank Data in High Dimensions

Many recent advances in machine learning are driven by a challenging tri...
research
07/07/2018

Approximate Leave-One-Out for Fast Parameter Tuning in High Dimensions

Consider the following class of learning schemes: β̂ := argmin_β ∑_{j=1}^n ℓ(x_j...
