Approximate Cross-Validation with Low-Rank Data in High Dimensions

08/24/2020
by William T. Stephenson, et al.

Many recent advances in machine learning are driven by a challenging trifecta: large data size N, high dimension, and expensive algorithms. In this setting, cross-validation (CV) serves as an important tool for model assessment. Recent advances in approximate cross-validation (ACV) provide accurate approximations to CV with only a single model fit, avoiding traditional CV's requirement for repeated runs of expensive algorithms. Unfortunately, these ACV methods can lose both speed and accuracy in high dimensions unless sparsity structure is present in the data. Fortunately, there is an alternative type of simplifying structure that is present in most data: approximate low rank (ALR). Guided by this observation, we develop a new algorithm for ACV that is fast and accurate in the presence of ALR data. Our first key insight is that the Hessian matrix, whose inverse forms the computational bottleneck of existing ACV methods, is ALR. We show that, despite our use of the inverse Hessian, a low-rank approximation using the largest (rather than the smallest) matrix eigenvalues enables fast, reliable ACV. Our second key insight is that, in the presence of ALR data, the error of existing ACV methods roughly grows with the (approximate, low) rank rather than with the (full, high) dimension. These insights allow us to prove theoretical guarantees on the quality of our proposed algorithm, along with fast-to-compute upper bounds on its error. We demonstrate the speed and accuracy of our method, as well as the usefulness of our bounds, on a range of real and simulated data sets.
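To make the Hessian-inversion idea concrete, below is a minimal Python sketch of approximate leave-one-out CV for an L2-regularized logistic regression. It uses a generic infinitesimal-jackknife Newton step (one of the standard ACV variants this line of work builds on), with the inverse Hessian applied through a rank-k truncated SVD of the Hessian's data term, using the largest eigenvalues, via a Woodbury-style identity. This is an illustration of the general approach under stated assumptions, not the authors' released implementation; the function names (`acv_loo_margins`, `hinv`), the rank k, the regularization strength, and the toy data are all assumptions for the example.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.sparse.linalg import svds


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def acv_loo_margins(X, y, theta_hat, lam, k):
    """Approximate leave-one-out margins x_n^T theta_{-n} (hypothetical helper).

    X: (N, D) data; y: (N,) labels in {0, 1};
    theta_hat: minimizer of sum_n loss_n(theta) + (lam/2) ||theta||^2;
    k: rank kept for the Hessian's data term (k << D).
    """
    p = sigmoid(X @ theta_hat)
    grad_w = p - y           # per-point first derivative  l'_n
    hess_w = p * (1.0 - p)   # per-point curvature         l''_n
    # Hessian H = X^T diag(hess_w) X + lam * I. The data term equals A^T A
    # with A = sqrt(diag(hess_w)) X, so its top-k eigenpairs (the *largest*
    # eigenvalues) come from a truncated SVD of A.
    A = np.sqrt(hess_w)[:, None] * X
    _, s, Vt = svds(A, k=k)
    evals = s ** 2           # top-k eigenvalues of X^T diag(hess_w) X

    def hinv(M):
        # Woodbury-style inverse of (lam I + V diag(evals) V^T), applied
        # columnwise: (M - V diag(evals / (evals + lam)) V^T M) / lam
        proj = Vt @ M
        return (M - Vt.T @ ((evals / (evals + lam))[:, None] * proj)) / lam

    # Infinitesimal-jackknife Newton step: theta_{-n} ~= theta_hat + l'_n H^{-1} x_n,
    # so the held-out margin is x_n^T theta_hat + l'_n * (x_n^T H^{-1} x_n).
    Hinv_Xt = hinv(X.T)                                   # (D, N)
    return X @ theta_hat + grad_w * np.einsum("nd,dn->n", X, Hinv_Xt)


# Toy usage on approximately low-rank (ALR) data: rank-5 signal in D = 200.
rng = np.random.default_rng(0)
N, D, r = 500, 200, 5
X = rng.normal(size=(N, r)) @ rng.normal(size=(r, D))
X += 0.01 * rng.normal(size=(N, D))                       # small full-rank noise
theta_true = rng.normal(size=r) @ rng.normal(size=(r, D))
y = (rng.uniform(size=N) < sigmoid(X @ theta_true)).astype(float)

lam = 1.0

def loss_grad(th):
    z = X @ th
    val = np.sum(np.logaddexp(0.0, z) - y * z) + 0.5 * lam * (th @ th)
    grad = X.T @ (sigmoid(z) - y) + lam * th
    return val, grad

theta_hat = minimize(loss_grad, np.zeros(D), jac=True, method="L-BFGS-B").x
margins = acv_loo_margins(X, y, theta_hat, lam, k=10)     # approximate LOO scores
```

In this sketch, forming and inverting the exact Hessian would cost O(D^3); the rank-k route costs roughly O(NDk) for the truncated SVD plus matrix products, which is where the speedup comes from when the data are ALR and k can be taken much smaller than D.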

Related research

A least-squares method for sparse low rank approximation of multivariate functions (04/30/2013)
In this paper, we propose a low-rank approximation method based on discr...

On the numerical rank of radial basis function kernels in high dimension (06/23/2017)
Low-rank approximations are popular methods to reduce the high computati...

Approximate Cross-Validation for Structured Models (06/23/2020)
Many modern data analyses benefit from explicitly modeling dependence st...

Rank Bounds for Approximating Gaussian Densities in the Tensor-Train Format (01/22/2020)
Low rank tensor approximations have been employed successfully, for exam...

Return of the Infinitesimal Jackknife (06/01/2018)
The error or variability of machine learning algorithms is often assesse...

Sparse Approximate Cross-Validation for High-Dimensional GLMs (05/31/2019)
Leave-one-out cross validation (LOOCV) can be particularly accurate amon...

A Faster Interior-Point Method for Sum-of-Squares Optimization (02/17/2022)
We present a faster interior-point method for optimizing sum-of-squares ...
