ET-Lasso: Efficient Tuning of Lasso for High-Dimensional Data

10/10/2018
by Songshan Yang et al.

L1 regularization (the Lasso) has proven to be a versatile tool for selecting relevant features and estimating model coefficients simultaneously. Despite its popularity, it is very challenging to guarantee the feature-selection consistency of the Lasso. One way to improve feature-selection consistency is to choose an ideal tuning parameter. Traditional tuning criteria, such as cross-validation and BIC, mainly focus on minimizing the estimated prediction error or maximizing the posterior model probability; they may be time-consuming or fail to control the false discovery rate (FDR) when the number of features is extremely large. Another way is to introduce pseudo-features to learn the importance of the original ones. The recently proposed Knockoff filter controls the FDR when performing feature selection, but its performance is sensitive to the choice of the expected FDR threshold. Motivated by these ideas, we propose a new method that uses pseudo-features to obtain an ideal tuning parameter. In particular, we present Efficient Tuning of Lasso (ET-Lasso), which separates active and inactive features by adding permuted features as pseudo-features in linear models. Because the pseudo-features are inactive by construction, they yield a cutoff for selecting the tuning parameter that separates active from inactive features. Experiments on both simulated and real-world data show that ET-Lasso effectively and efficiently selects active features under a wide range of scenarios.
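To make the mechanism concrete, below is a minimal sketch of the cutoff idea using scikit-learn's lasso_path on toy data. The permutation scheme, the toy data, and the one-pass selection rule are illustrative assumptions for exposition, not the authors' exact procedure.

```python
# Sketch (not the paper's implementation): append a row-permuted copy of X
# as pseudo-features, trace the lasso path, and use the penalty at which
# the first pseudo-feature enters as the cutoff for the tuning parameter.
import numpy as np
from sklearn.linear_model import lasso_path

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0                        # only the first 5 features are truly active
y = X @ beta + rng.standard_normal(n)

# Pseudo-features: a row-permuted copy of X. Permuting the rows breaks any
# association with y, so every pseudo-feature is inactive by construction.
X_perm = X[rng.permutation(n)]
X_aug = np.hstack([X, X_perm])

# Trace the lasso path over a decreasing grid of penalties.
alphas, coefs, _ = lasso_path(X_aug, y)   # coefs has shape (2p, n_alphas)

# For each feature, the largest penalty at which its coefficient is nonzero,
# i.e., the point where it enters the path; 0 if it never enters.
nonzero = coefs != 0
entry = np.where(nonzero.any(axis=1), alphas[nonzero.argmax(axis=1)], 0.0)

# Cutoff: the penalty at which the first pseudo-feature enters. Keep the
# original features that enter strictly before any pseudo-feature does.
cutoff = entry[p:].max()
selected = np.flatnonzero(entry[:p] > cutoff)
print("selected features:", selected)
```

Since a permuted feature can only enter the path by chance, the first pseudo-feature entry gives a data-driven cutoff: original features entering earlier are declared active, without hand-tuning an expected FDR threshold.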


