Safe Screening for Logistic Regression with ℓ_0-ℓ_2 Regularization

02/01/2022
by Anna Deza et al.

In logistic regression, it is often desirable to use regularization to promote sparse solutions, particularly for problems with many features relative to the number of available labels. In this paper, we present screening rules that safely remove features from logistic regression with ℓ_0-ℓ_2 regularization before the problem is solved. The proposed safe screening rules are based on lower bounds from the Fenchel dual of strong conic relaxations of the logistic regression problem. Numerical experiments with real and synthetic data suggest that a high percentage of the features can be effectively and safely removed a priori, leading to substantial speed-ups in the computations.
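To illustrate the general principle behind dual-bound safe screening (not the paper's ℓ_0-ℓ_2 conic-relaxation rules, which are more involved), the sketch below implements the classic gap-safe sphere test for the simpler ℓ_1-regularized least-squares problem (Lasso): a dual-feasible point is built from the residual, the duality gap bounds the distance to the dual optimum, and any feature whose worst-case correlation over that sphere stays below 1 is certified to be zero at the optimum. Function names (`ista_lasso`, `gap_safe_screen`) and the data setup are illustrative, not from the paper.

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    # Proximal gradient (ISTA) for min_w 0.5*||y - Xw||^2 + lam*||w||_1
    L = np.linalg.norm(X, 2) ** 2           # Lipschitz constant of the smooth part
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        u = w - X.T @ (X @ w - y) / L       # gradient step
        w = np.sign(u) * np.maximum(np.abs(u) - lam / L, 0.0)  # soft-threshold
    return w

def gap_safe_screen(X, y, lam, w):
    # Build a dual-feasible point by rescaling the residual so ||X^T theta||_inf <= 1
    res = y - X @ w
    theta = res / max(lam, np.abs(X.T @ res).max())
    # Duality gap between the primal value at w and the dual value at theta
    primal = 0.5 * res @ res + lam * np.abs(w).sum()
    dual = 0.5 * (y @ y) - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)
    r = np.sqrt(2.0 * gap) / lam            # gap-safe sphere radius around theta
    # Safe test: |x_j^T theta| + r*||x_j|| < 1  certifies  w*_j = 0 at the optimum
    scores = np.abs(X.T @ theta) + r * np.linalg.norm(X, axis=0)
    return scores < 1.0                     # True = feature can be safely removed
```

Running a rough solve first, screening, and then solving the reduced problem is the intended workflow: the test is "safe" because every feature it removes is provably zero in the exact solution, so the reduced problem has the same optimum.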

