Efficient algorithm to select tuning parameters in sparse regression modeling with regularization

09/12/2011
by Kei Hirose, et al.

In sparse regression modeling via regularization such as the lasso, it is important to select appropriate values of tuning parameters including regularization parameters. The choice of tuning parameters can be viewed as a model selection and evaluation problem. Mallows' C_p type criteria may be used as a tuning parameter selection tool in lasso-type regularization methods, for which the concept of degrees of freedom plays a key role. In the present paper, we propose an efficient algorithm that computes the degrees of freedom by extending the generalized path seeking algorithm. Our procedure allows us to construct model selection criteria for evaluating models estimated by regularization with a wide variety of convex and non-convex penalties. Monte Carlo simulations demonstrate that our methodology performs well in various situations. A real data example is also given to illustrate our procedure.
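To make the idea concrete, here is a minimal sketch of tuning parameter selection with a Mallows' C_p type criterion in the plain lasso case, where the degrees of freedom can be estimated by the number of nonzero coefficients (Zou, Hastie and Tibshirani, 2007). This is an illustration only, not the authors' generalized path seeking extension, and it uses scikit-learn's lasso_path on simulated data as a stand-in example.

```python
# Sketch: C_p-type tuning parameter selection for the lasso.
# Assumes df ≈ number of nonzero coefficients (known lasso result);
# the paper's algorithm instead computes df along a generalized path
# for a wider class of convex and non-convex penalties.
import numpy as np
from sklearn.linear_model import lasso_path, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.concatenate([np.array([3.0, -2.0, 1.5]), np.zeros(p - 3)])
y = X @ beta_true + rng.standard_normal(n)

# Estimate the error variance from the full least-squares fit.
ols = LinearRegression().fit(X, y)
rss_full = np.sum((y - ols.predict(X)) ** 2)
sigma2_hat = rss_full / (n - p - 1)

# Compute the lasso solution path over a grid of regularization parameters.
alphas, coefs, _ = lasso_path(X, y, n_alphas=100)

cp_values = []
for j in range(len(alphas)):
    beta = coefs[:, j]
    rss = np.sum((y - X @ beta) ** 2)
    df = np.count_nonzero(beta)          # degrees-of-freedom estimate
    cp = rss / sigma2_hat - n + 2 * df   # Mallows' C_p type criterion
    cp_values.append(cp)

best = int(np.argmin(cp_values))
print(f"selected alpha = {alphas[best]:.4f}, "
      f"nonzero coefficients = {np.count_nonzero(coefs[:, best])}")
```

The criterion trades off residual sum of squares against model complexity through the degrees of freedom, which is why an efficient way to compute df along the regularization path is central to using C_p-type criteria with more general penalties.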

Related research

03/20/2012
Selection of tuning parameters in bridge regression models via Bayesian information criterion
We consider the bridge linear regression modeling, which can produce a s...

10/11/2021
yaglm: a Python package for fitting and tuning generalized linear models that supports structured, adaptive and non-convex penalties
The yaglm package aims to make the broader ecosystem of modern generaliz...

11/22/2019
On the use of information criteria for subset selection in least squares regression
Least squares (LS) based subset selection methods are popular in linear ...

10/11/2017
Adaptive multi-penalty regularization based on a generalized Lasso path
For many algorithms, parameter tuning remains a challenging and critical...

02/22/2011
Semi-supervised logistic discrimination for functional data
Multi-class classification methods based on both labeled and unlabeled f...

07/06/2022
Degrees of Freedom and Information Criteria for the Synthetic Control Method
We provide an analytical characterization of the model flexibility of th...

11/12/2013
When Does More Regularization Imply Fewer Degrees of Freedom? Sufficient Conditions and Counter Examples from Lasso and Ridge Regression
Regularization aims to improve prediction performance of a given statist...
