
Expanding boundaries of Gap Safe screening

by Cassio Dantas, et al.

Sparse optimization problems are ubiquitous in many fields such as statistics, signal/image processing, and machine learning, and have given rise to many iterative algorithms to solve them. A powerful strategy for accelerating these algorithms is known as safe screening: it allows the early identification of zero coordinates in the solution, which can then be eliminated to reduce the problem's size and speed up convergence. In this work, we extend the existing Gap Safe screening framework by relaxing the global strong-concavity assumption on the dual cost function. Instead, we exploit local regularity properties, that is, strong concavity on well-chosen subsets of the domain. The non-negativity constraint is also integrated into the existing framework. Besides making safe screening applicable to a broader class of functions that includes beta-divergences (e.g., the Kullback-Leibler divergence), the proposed approach also improves upon the existing Gap Safe screening rules in previously applicable cases (e.g., logistic regression). The proposed general framework is exemplified by some notable particular cases: the logistic function, and the beta = 1.5 and Kullback-Leibler divergences. Finally, we showcase the effectiveness of the proposed screening rules with different solvers (coordinate descent, multiplicative-update, and proximal gradient algorithms) and different data sets (binary classification, hyperspectral, and count data).
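To make the screening idea concrete, here is a minimal sketch of the classical Gap Safe rule for the Lasso, the quadratic-loss setting that the paper generalizes beyond. The function name and the toy data are illustrative, not from the paper: given any primal iterate, a dual-feasible point is built by rescaling the residual, the duality gap bounds a "safe sphere" around the dual optimum, and any coordinate whose worst-case correlation over that sphere stays below 1 is provably zero at the optimum.

```python
import numpy as np

def gap_safe_screen_lasso(X, y, w, lam):
    """One Gap Safe screening pass for the Lasso
    min_w 0.5*||y - Xw||^2 + lam*||w||_1  (illustrative sketch).

    Returns a boolean mask: True where the coordinate is provably
    zero at the optimum and can be removed from the problem.
    """
    residual = y - X @ w
    # Dual-feasible point: rescale the residual into the dual
    # constraint set {theta : ||X^T theta||_inf <= 1}.
    scale = max(lam, np.max(np.abs(X.T @ residual)))
    theta = residual / scale
    # Duality gap between the primal and dual objectives.
    primal = 0.5 * residual @ residual + lam * np.abs(w).sum()
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = primal - dual
    # Gap Safe sphere radius: every dual-optimal point lies within
    # distance radius of theta (strong concavity of the dual).
    radius = np.sqrt(2.0 * max(gap, 0.0)) / lam
    # Screen coordinate j when even the most favorable dual point in
    # the sphere cannot activate it: |x_j^T theta| + radius*||x_j|| < 1.
    scores = np.abs(X.T @ theta) + radius * np.linalg.norm(X, axis=0)
    return scores < 1.0

# Toy usage: for lam above ||X^T y||_inf the Lasso solution is all-zero,
# so screening at w = 0 should discard every coordinate.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 10))
y = rng.standard_normal(20)
lam = 1.1 * np.max(np.abs(X.T @ y))
mask = gap_safe_screen_lasso(X, y, np.zeros(10), lam)
```

The rule is "safe" because the sphere radius follows from strong concavity of the dual objective; the paper's contribution is to recover such a radius when strong concavity only holds locally, which covers losses like the Kullback-Leibler divergence where this sketch does not apply as-is.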



