Improving Lasso for model selection and prediction

07/05/2019
by Piotr Pokarowski et al.

It is known that the Thresholded Lasso (TL), SCAD and MCP correct the intrinsic estimation bias of the Lasso. In this paper we propose an alternative method of improving the Lasso for predictive models with general convex loss functions, which encompass normal linear models, logistic regression, quantile regression and support vector machines. For a given penalty, we order the absolute values of the nonzero Lasso coefficients and then select the final model from the resulting small nested family using the Generalized Information Criterion (GIC). We derive exponential upper bounds on the selection error of the method. These results confirm that, at least for normal linear models, our algorithm can serve as a benchmark for the theory of model selection: it is constructive, computationally efficient and leads to consistent model selection under weak assumptions. Constructivity means that, in contrast to the TL, SCAD or MCP, consistent selection does not rely on unknown quantities such as the cone invertibility factor; our algorithm needs only the sample size, the number of predictors and an upper bound on the noise parameter. We show in numerical experiments on synthetic and real-world data sets that an implementation of our algorithm is more accurate than implementations of the concave regularizations studied. Our procedure is implemented in the R package "DMRnet", available on the CRAN repository.
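
The selection step is straightforward to sketch for the normal linear model. The following R snippet is a minimal illustration of the idea, assuming glmnet as the Lasso solver and log(p) as an illustrative GIC penalty (the constant used in the paper may differ); the authors' own, more general implementation is the DMRnet package.

```r
## Minimal sketch: order nonzero Lasso coefficients, build a nested family
## of models, pick the GIC minimizer. Illustrative only; the GIC penalty
## log(p) is an assumed choice, not necessarily the paper's constant.
library(glmnet)

set.seed(1)
n <- 100; p <- 50
X <- matrix(rnorm(n * p), n, p)
beta <- c(3, 2, -2, rep(0, p - 3))
y <- drop(X %*% beta) + rnorm(n)

## Step 1: Lasso fit for a given penalty (here: cross-validated lambda).
fit <- cv.glmnet(X, y)
b <- as.vector(coef(fit, s = "lambda.min"))[-1]   # drop the intercept
support <- which(b != 0)

## Step 2: order the selected variables by |Lasso coefficient|, decreasing.
ordering <- support[order(abs(b[support]), decreasing = TRUE)]

## Step 3: nested family M_0 subset M_1 subset ... with
## GIC(M) = n * log(RSS_M / n) + |M| * log(p)  (illustrative penalty).
gic <- numeric(length(ordering) + 1)
gic[1] <- n * log(sum((y - mean(y))^2) / n)       # intercept-only model
for (k in seq_along(ordering)) {
  Xk <- X[, ordering[1:k], drop = FALSE]
  rss <- sum(resid(lm(y ~ Xk))^2)                 # least-squares refit
  gic[k + 1] <- n * log(rss / n) + k * log(p)
}

## Step 4: the final model minimizes GIC over the nested family.
k.best <- which.min(gic) - 1
selected <- sort(ordering[seq_len(k.best)])
selected   # indices of the selected predictors
```

For other convex losses named in the abstract (logistic, quantile, SVM), the same scheme applies with the squared-error refits and RSS replaced by the corresponding loss.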

Related research

Estimation And Selection Via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications (12/29/2011)
The ℓ_1-penalized method, or the Lasso, has emerged as an important tool...

Model Selection and estimation of Multi Screen Penalty (12/02/2018)
We propose a multi-step method, called Multi Screen Penalty (MSP), to es...

Sparse Private LASSO Logistic Regression (04/24/2023)
LASSO regularized logistic regression is particularly useful for its bui...

The generalized hyperbolic family and automatic model selection through the multiple-choice LASSO (06/14/2023)
We revisit the generalized hyperbolic (GH) distribution and its nested m...

On an improvement of LASSO by scaling (08/22/2018)
A sparse modeling is a major topic in machine learning and statistics. L...

Pathway Lasso: Estimate and Select Sparse Mediation Pathways with High Dimensional Mediators (03/24/2016)
In many scientific studies, it becomes increasingly important to delinea...

What Makes A Good Fisherman? Linear Regression under Self-Selection Bias (05/06/2022)
In the classical setting of self-selection, the goal is to learn k model...
