Rademacher upper bounds for cross-validation errors with an application to the lasso

07/30/2020
by   Ning Xu, et al.

We establish a general upper bound for K-fold cross-validation (K-CV) errors that can be adapted to many K-CV-based estimators and learning algorithms. Based on the Rademacher complexity of the model and the Orlicz-Ψ_ν norm of the error process, the CV error upper bound applies to both light-tailed and heavy-tailed error distributions. We also extend the CV error upper bound to β-mixing data using the technique of independent blocking. We provide a Python package (CVbound, <https://github.com/isaac2math>) for computing the CV error upper bound in K-CV-based algorithms. Using the lasso as an example, we demonstrate in simulations that the upper bounds are tight and stable across different parameter settings and random seeds. Beyond accurately bounding the CV errors for the lasso, the minimizer of the new upper bounds can be used as a criterion for variable selection. Compared with the CV-error minimizer, simulations show that tuning the lasso penalty parameter according to the minimizer of the upper bound yields a sparser and more stable model that retains all of the relevant variables.
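The following is a minimal sketch of the tuning idea described above: choose the lasso penalty by minimizing an upper bound on the K-fold CV error rather than the CV error itself. The bound used here is a hypothetical placeholder (CV error plus a sparsity-based complexity term), not the Rademacher/Orlicz-norm bound derived in the paper; for the authors' actual computation, see the CVbound package linked above.

```python
# Illustrative sketch only: the complexity term below is a hypothetical
# stand-in for the paper's Rademacher-based bound, not its implementation.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso
from sklearn.model_selection import KFold

# Synthetic data with a handful of relevant variables
X, y = make_regression(n_samples=200, n_features=50, n_informative=5,
                       noise=1.0, random_state=0)

lambdas = np.logspace(-3, 1, 30)   # candidate penalty parameters
kf = KFold(n_splits=5, shuffle=True, random_state=0)

# K-fold CV error for each candidate penalty
cv_errors = []
for lam in lambdas:
    fold_errors = []
    for train_idx, test_idx in kf.split(X):
        model = Lasso(alpha=lam, max_iter=10000)
        model.fit(X[train_idx], y[train_idx])
        resid = y[test_idx] - model.predict(X[test_idx])
        fold_errors.append(np.mean(resid ** 2))
    cv_errors.append(np.mean(fold_errors))
cv_errors = np.array(cv_errors)

# Placeholder "upper bound": CV error plus a complexity term that shrinks as
# the fitted model becomes sparser (assumption, not the paper's formula).
complexity = np.array([
    np.count_nonzero(Lasso(alpha=lam, max_iter=10000).fit(X, y).coef_)
    for lam in lambdas
])
upper_bound = cv_errors + 2.0 * complexity * np.log(X.shape[1]) / X.shape[0]

lam_cv = lambdas[np.argmin(cv_errors)]       # classic CV-error minimizer
lam_bound = lambdas[np.argmin(upper_bound)]  # upper-bound minimizer
print(f"lambda by CV error:    {lam_cv:.4f}")
print(f"lambda by upper bound: {lam_bound:.4f}")
```

Because the complexity term decreases as the penalty grows, the bound's minimizer tends toward a larger penalty than the plain CV-error minimizer, which is consistent with the sparser, more stable selection reported in the abstract.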
