Sparse Regression: Scalable algorithms and empirical performance

02/18/2019
by Dimitris Bertsimas et al.

In this paper, we review state-of-the-art methods for feature selection in statistics with an application-oriented eye. Indeed, sparsity is a valuable property, but the profusion of research on the topic may have left practitioners with little guidance. We demonstrate empirically how noise and correlation impact both the accuracy, i.e., the proportion of true features selected, and the false detection rate, i.e., the proportion of selected features that are incorrect, for five methods: the cardinality-constrained formulation, its Boolean relaxation, ℓ_1 regularization, and two methods with non-convex penalties. A cogent feature selection method is expected to exhibit a two-fold convergence: the accuracy and the false detection rate should converge to 1 and 0, respectively, as the sample size increases. In other words, a proper method should recover all of the true features and nothing but the true features. Empirically, the integer optimization formulation and its Boolean relaxation come closest to exhibiting these two properties consistently across various regimes of noise and correlation. In addition, apart from the discrete optimization approach, which requires substantial yet often affordable computational time, all methods terminate in times comparable with those of the glmnet package for the Lasso. We release code for the methods that were not publicly implemented. Considered jointly, accuracy, false detection rate, and computational time provide a comprehensive assessment of each feature selection method and shed light on alternatives to Lasso regularization that are not yet as popular in practice.
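As a minimal sketch of the two metrics the abstract defines, the snippet below generates a synthetic sparse-regression instance and computes accuracy and false detection rate for a Lasso fit. It uses scikit-learn's Lasso as a stand-in for glmnet; the dimensions, signal-to-noise ratio, and regularization strength are illustrative assumptions, and none of the paper's own methods are reproduced here.

```python
# Sketch: accuracy and false detection rate for Lasso-based
# feature selection on synthetic data (assumed setup, not the
# paper's experimental protocol).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, k = 500, 1000, 10          # samples, features, true sparsity (assumed)
snr = 5.0                        # signal-to-noise ratio (assumed)

X = rng.standard_normal((n, p))
true_support = rng.choice(p, size=k, replace=False)
w = np.zeros(p)
w[true_support] = 1.0
signal = X @ w
noise = rng.standard_normal(n) * np.sqrt(signal.var() / snr)
y = signal + noise

# alpha is arbitrary here; in practice it would be tuned, e.g. by
# cross-validation, as glmnet does along a regularization path.
model = Lasso(alpha=0.1).fit(X, y)
selected = np.flatnonzero(model.coef_)

# Accuracy: proportion of true features recovered.
accuracy = len(set(selected) & set(true_support)) / k
# False detection rate: proportion of selected features that are spurious.
fdr = len(set(selected) - set(true_support)) / max(len(selected), 1)
print(f"accuracy = {accuracy:.2f}, false detection rate = {fdr:.2f}")
```

A method with the two-fold convergence the paper describes would drive accuracy toward 1 and the false detection rate toward 0 as n grows with p and k fixed.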

