Approximate cross-validation formula for Bayesian linear regression

10/25/2016
by Yoshiyuki Kabashima, et al.

Cross-validation (CV) is a technique for evaluating the predictive ability of statistical models/learning systems from a given data set. Despite its wide applicability, its rather heavy computational cost can prevent its use as the system size grows. To resolve this difficulty in the case of Bayesian linear regression, we develop a formula that approximately evaluates the leave-one-out CV error without actually performing CV. The usefulness of the formula is confirmed by a statistical-mechanical analysis of a synthetic model, as well as by application to a real-world supernova data set.
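The paper's formula itself is derived for the full Bayesian setting, but the flavor of "LOO without refitting" can be illustrated with a simpler, classical identity: for ridge regression (the MAP estimate of Bayesian linear regression with a Gaussian prior), the leave-one-out residual is exactly the training residual rescaled by one minus the leverage, e_i = (y_i - x_i^T w) / (1 - H_ii). The sketch below, with made-up synthetic data and hyperparameters, checks this closed form against brute-force refitting; it is an illustration of the general idea, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 50, 5, 1.0  # illustrative sizes and ridge strength

# Synthetic data: y = X w + noise
X = rng.normal(size=(n, p))
w_true = rng.normal(size=p)
y = X @ w_true + 0.1 * rng.normal(size=n)

# Ridge / MAP solution: w = (X^T X + lam I)^{-1} X^T y
A = X.T @ X + lam * np.eye(p)
w_hat = np.linalg.solve(A, X.T @ y)

# Leverages H_ii = x_i^T A^{-1} x_i, computed without forming the full hat matrix
H_diag = np.einsum("ij,jk,ik->i", X, np.linalg.inv(A), X)

# Closed-form LOO residuals: no refitting required
resid = y - X @ w_hat
loo_closed = resid / (1.0 - H_diag)

# Brute-force LOO for comparison: refit n times, once per held-out point
loo_brute = np.empty(n)
for i in range(n):
    mask = np.arange(n) != i
    Xi, yi = X[mask], y[mask]
    wi = np.linalg.solve(Xi.T @ Xi + lam * np.eye(p), Xi.T @ yi)
    loo_brute[i] = y[i] - X[i] @ wi

print(np.allclose(loo_closed, loo_brute))  # True
```

The closed form replaces n refits with a single fit plus a leverage computation, which is precisely the kind of saving an approximate CV formula buys when refitting is expensive.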


research
11/15/2017

Accelerating Cross-Validation in Multinomial Logistic Regression with ℓ_1-Regularization

We develop an approximate formula for evaluating a cross-validation esti...
research
02/27/2019

Cross validation in sparse linear regression with piecewise continuous nonconvex penalties and its acceleration

We investigate the signal reconstruction performance of sparse linear re...
research
06/12/2018

CID Models on Real-world Social Networks and GOF Measurements

Assessing the model fit quality of statistical models for network data i...
research
11/20/2020

Optimizing Approximate Leave-one-out Cross-validation to Tune Hyperparameters

For a large class of regularized models, leave-one-out cross-validation ...
research
12/11/2002

A Theory of Cross-Validation Error

This paper presents a theory of error in cross-validation testing of alg...
