Optimizing Approximate Leave-one-out Cross-validation to Tune Hyperparameters

11/20/2020
by Ryan Burn, et al.

For a large class of regularized models, leave-one-out cross-validation can be efficiently estimated with an approximate leave-one-out formula (ALO). We consider the problem of adjusting hyperparameters so as to optimize ALO. We derive efficient formulas to compute the gradient and Hessian of ALO and show how to apply a second-order optimizer to find hyperparameters. We demonstrate the usefulness of the proposed approach by finding hyperparameters for regularized logistic regression and ridge regression on various real-world data sets.

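For concreteness, ridge regression admits an exact leave-one-out formula, the simplest instance of the idea behind ALO: the LOO residual for observation i is (y_i - ŷ_i)/(1 - h_ii), where h_ii is the i-th diagonal entry of the hat matrix X(XᵀX + λI)⁻¹Xᵀ. The sketch below is not from the paper; names such as ridge_loo_mse and tune_lambda are illustrative, and the derivative-free search over log λ merely stands in for the paper's second-order optimizer built on analytic gradient and Hessian formulas.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ridge_loo_mse(X, y, lam):
    """Exact leave-one-out MSE for ridge regression via the hat matrix.

    For ridge, LOO residuals have the closed form
        e_i = (y_i - yhat_i) / (1 - h_ii),
    where h_ii are the diagonal entries of H = X (X^T X + lam I)^{-1} X^T.
    """
    n, p = X.shape
    G = X.T @ X + lam * np.eye(p)
    beta = np.linalg.solve(G, X.T @ y)
    # Diagonal of the hat matrix without forming the full n-by-n matrix.
    H_diag = np.einsum("ij,ji->i", X, np.linalg.solve(G, X.T))
    resid = (y - X @ beta) / (1.0 - H_diag)
    return np.mean(resid ** 2)

def tune_lambda(X, y):
    # Search over log(lambda); the paper instead applies a second-order
    # optimizer using analytic ALO gradient/Hessian formulas.
    res = minimize_scalar(lambda t: ridge_loo_mse(X, y, np.exp(t)),
                          bounds=(-8.0, 8.0), method="bounded")
    return np.exp(res.x)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 25))
    beta_true = rng.normal(size=25) * (rng.random(25) < 0.3)
    y = X @ beta_true + rng.normal(scale=0.5, size=200)
    lam_star = tune_lambda(X, y)
    print(f"selected lambda: {lam_star:.4f}, "
          f"LOO MSE: {ridge_loo_mse(X, y, lam_star):.4f}")
```

Optimizing on the log scale keeps the hyperparameter positive and tends to make the objective better conditioned, which is also the usual parameterization when gradient- or Hessian-based methods are applied to such criteria.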

Related research

11/15/2017  Accelerating Cross-Validation in Multinomial Logistic Regression with ℓ_1-Regularization
We develop an approximate formula for evaluating a cross-validation esti...

10/24/2018  Leave-one-out cross-validation for non-factorizable normal models
Cross-validation can be used to measure a model's predictive accuracy fo...

10/25/2016  Approximate cross-validation formula for Bayesian linear regression
Cross-validation (CV) is a technique for evaluating the ability of stati...

04/06/2020  Online Hyperparameter Search Interleaved with Proximal Parameter Updates
There is a clear need for efficient algorithms to tune hyperparameters f...

06/26/2023  Gain Confidence, Reduce Disappointment: A New Approach to Cross-Validation for Sparse Regression
Ridge regularized sparse regression involves selecting a subset of featu...

03/27/2015  Bayesian Cross Validation and WAIC for Predictive Prior Design in Regular Asymptotic Theory
Prior design is one of the most important problems in both statistics an...

06/26/2012  Predictive Approaches For Gaussian Process Classifier Model Selection
In this paper we consider the problem of Gaussian process classifier (GP...
