Screening Rules and its Complexity for Active Set Identification

09/06/2020
by Eugene Ndiaye, et al.

Screening rules were recently introduced as a technique for explicitly identifying active structures, such as sparsity, in optimization problems arising in machine learning. This has led to new acceleration methods based on substantial dimension reduction. We show that screening rules stem from a combination of natural properties of subdifferential sets and optimality conditions, and can hence be understood in a unified way. Under mild assumptions, we analyze the number of iterations any converging algorithm needs to identify the optimal active set, and show that it depends only on the algorithm's convergence rate.


Related research

02/19/2016  GAP Safe Screening Rules for Sparse-Group-Lasso
In high dimensional settings, sparse structures are crucial for efficien...

02/08/2016  Simultaneous Safe Screening of Features and Samples in Doubly Sparse Modeling
The problem of learning a sparse model is conceptually interpreted as th...

04/19/2020  Safe Screening Rules for ℓ_0-Regression
We give safe screening rules to eliminate variables from regression with...

05/12/2021  Look-Ahead Screening Rules for the Lasso
The lasso is a popular method to induce shrinkage and sparsity in the so...

02/12/2018  Safe Triplet Screening for Distance Metric Learning
We study safe screening for metric learning. Distance metric learning ca...

10/22/2021  Safe rules for the identification of zeros in the solutions of the SLOPE problem
In this paper we propose a methodology to accelerate the resolution of t...

10/02/2020  Nonsmoothness in Machine Learning: specific structure, proximal identification, and applications
Nonsmoothness is often a curse for optimization; but it is sometimes a b...
