Cross validation residuals for generalised least squares and other correlated data models

09/05/2018
by Ingrid Annette Baade, et al.

Cross validation residuals are well known for the ordinary least squares model. Here, leave-M-out cross validation is extended to generalised least squares. The relationship between cross validation residuals and Cook's distance is demonstrated in terms of an approximation to the difference between the generalised residual sum of squares for a model fitted to all the data (training and test) and for a model fitted to a reduced dataset (training data only). For generalised least squares, as for ordinary least squares, there is no need to refit the model to reduced-size datasets: all the quantities needed for K-fold cross validation are available after a single fit to the full dataset.
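As a minimal sketch of the no-refitting idea, the snippet below applies the standard ordinary least squares leave-one-out identity, e_i / (1 - h_ii), after whitening a correlated-error model with a known covariance matrix (an illustrative AR(1) choice). This is the textbook identity in the whitened space, not the paper's leave-M-out construction; all variable names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])

# Illustrative AR(1) error covariance: Sigma_ij = rho^|i-j| (assumed known)
rho = 0.5
Sigma = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
L = np.linalg.cholesky(Sigma)
y = X @ np.array([1.0, 2.0, -1.0]) + L @ rng.normal(size=n)

# Whitening: GLS on (X, y) is OLS on (Xw, yw)
Linv = np.linalg.inv(L)
Xw, yw = Linv @ X, Linv @ y

# Hat matrix and residuals from the single full-data fit
H = Xw @ np.linalg.solve(Xw.T @ Xw, Xw.T)
e = yw - H @ yw

# Leave-one-out cross validation residuals, no refitting required
loo = e / (1 - np.diag(H))

# Check against an explicit refit leaving out observation 0
i = 0
mask = np.arange(n) != i
beta_i = np.linalg.lstsq(Xw[mask], yw[mask], rcond=None)[0]
refit = yw[i] - Xw[i] @ beta_i
assert np.allclose(loo[i], refit)
```

The same hat-matrix quantities also enter Cook's distance, which is what links the deletion diagnostics to the cross validation residuals discussed above.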


