Influence of single observations on the choice of the penalty parameter in ridge regression

11/09/2019
by Kristoffer H. Hellton, et al.

Penalized regression methods, such as ridge regression, rely heavily on the choice of a tuning, or penalty, parameter, which is often computed via cross-validation. Discrepancies in the value of the penalty parameter may lead to substantial differences in regression coefficient estimates and predictions. In this paper, we investigate the effect of single observations on the optimal choice of the tuning parameter, showing how the presence of influential points can dramatically change it. We classify points as "expanders" or "shrinkers", based on their effect on model complexity. Our approach supplies a visual exploratory tool to identify influential points, and is naturally implementable for high-dimensional data, where traditional approaches usually fail. Applications to real data examples, both low- and high-dimensional, and a simulation study are presented.
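
A minimal sketch of the underlying idea, not the authors' exact procedure: refit the cross-validated ridge penalty with each observation left out in turn and record how far the selected value moves. Points whose removal shifts the penalty noticeably are candidates for influential observations. The use of scikit-learn's RidgeCV, the diabetes dataset, and the fixed penalty grid below are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import load_diabetes
from sklearn.linear_model import RidgeCV

# Illustrative data and penalty grid (assumptions, not taken from the paper).
X, y = load_diabetes(return_X_y=True)
alphas = np.logspace(-3, 3, 200)

def cv_penalty(X, y):
    """Return the ridge penalty selected by leave-one-out cross-validation."""
    return RidgeCV(alphas=alphas).fit(X, y).alpha_

alpha_full = cv_penalty(X, y)

# Case deletion: recompute the selected penalty with each observation removed.
n = X.shape[0]
delta = np.empty(n)
for i in range(n):
    keep = np.arange(n) != i
    delta[i] = cv_penalty(X[keep], y[keep]) - alpha_full

# Large |delta[i]| flags observation i as influential for the penalty choice;
# the sign shows whether its removal increases or decreases the selected penalty.
print("Selected penalty on full data:", alpha_full)
print("Five most influential observations:", np.argsort(-np.abs(delta))[:5])
```

The sign of the recorded change indicates whether cross-validation selects a smaller or larger penalty once the point is removed, which is the kind of effect on model complexity that the abstract's "expander"/"shrinker" distinction refers to.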


Related research

08/10/2019: A Survey of Tuning Parameter Selection for High-dimensional Regression
Penalized (or regularized) regression, as represented by Lasso and its v...

01/27/2020: Penalized angular regression for personalized predictions
Personalization is becoming an important feature in many predictive appl...

05/19/2020: Fast cross-validation for multi-penalty ridge regression
Prediction based on multiple high-dimensional data types needs to accoun...

06/06/2017: Shape Parameter Estimation
Performance of machine learning approaches depends strongly on the choic...

09/17/2021: Adaptive Ridge-Penalized Functional Local Linear Regression
We introduce an original method of multidimensional ridge penalization i...

03/27/2016: Regularization Parameter Selection for a Bayesian Multi-Level Group Lasso Regression Model with Application to Imaging Genomics
We investigate the choice of tuning parameters for a Bayesian multi-leve...

05/11/2016: High dimensional thresholded regression and shrinkage effect
High-dimensional sparse modeling via regularization provides a powerful ...
