Distributed Dynamic Safe Screening Algorithms for Sparse Regularization

04/23/2022
by   Runxue Bao, et al.

Distributed optimization is one of the most efficient approaches for training models on massive samples. In the era of big data, however, large-scale learning problems often involve both massive samples and high-dimensional features. Safe screening is a popular technique for speeding up high-dimensional models by discarding inactive features, i.e., those whose coefficients are provably zero at the optimum. Nevertheless, existing safe screening methods are limited to the sequential setting. In this paper, we propose a new distributed dynamic safe screening (DDSS) method for sparsity-regularized models and apply it to shared-memory and distributed-memory architectures respectively; by exploiting the sparsity of both the model and the dataset simultaneously, it achieves significant speedup without any loss of accuracy. To the best of our knowledge, this is the first distributed dynamic safe screening method. Theoretically, we prove that the proposed method achieves a linear convergence rate with lower overall complexity and, almost surely, eliminates almost all inactive features within a finite number of iterations. Finally, extensive experimental results on benchmark datasets confirm the superiority of our proposed method.
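The screening idea the abstract describes can be illustrated with a single dynamic screening step for the Lasso. The sketch below uses the well-known Gap Safe rule (a standard sequential variant, not the paper's DDSS algorithm): from any primal iterate it builds a dual-feasible point, bounds the duality gap, and marks every feature whose correlation bound stays strictly below one as provably inactive. The function name and the rescaling choice are illustrative assumptions.

```python
import numpy as np

def gap_safe_screen(X, y, w, lam):
    """One dynamic (Gap Safe) screening step for the Lasso
    min_w 0.5*||y - Xw||^2 + lam*||w||_1.
    Returns a boolean mask: True = feature is provably inactive (coefficient 0).
    Illustrative sketch, not the paper's DDSS method."""
    r = y - X @ w                              # primal residual
    # Dual-feasible point obtained by rescaling the residual
    theta = r / max(lam, np.abs(X.T @ r).max())
    primal = 0.5 * r @ r + lam * np.abs(w).sum()
    dual = 0.5 * y @ y - 0.5 * lam**2 * np.sum((theta - y / lam) ** 2)
    gap = max(primal - dual, 0.0)              # duality gap bounds distance to optimum
    radius = np.sqrt(2.0 * gap) / lam          # radius of the safe sphere around theta
    col_norms = np.linalg.norm(X, axis=0)
    # Feature j is inactive if |x_j^T theta*| < 1 is certified over the sphere
    return np.abs(X.T @ theta) + radius * col_norms < 1.0
```

For example, at the all-zero iterate with a regularization level above lambda_max = ||X^T y||_inf the duality gap vanishes and every feature is screened out; as the iterate approaches the optimum, the shrinking gap lets the rule discard more and more inactive features, which is the "dynamic" aspect the abstract refers to.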


Related research

- Dynamic Sasvi: Strong Safe Screening for Norm-Regularized Least Squares (02/08/2021). A recently introduced technique for a sparse optimization problem called...
- Simultaneous Safe Screening of Features and Samples in Doubly Sparse Modeling (02/08/2016). The problem of learning a sparse model is conceptually interpreted as th...
- Fast OSCAR and OWL Regression via Safe Screening Rules (06/29/2020). Ordered Weighted L_1 (OWL) regularized regression is a new regression an...
- Safe Screening for Sparse Conditional Random Fields (11/27/2021). Sparse Conditional Random Field (CRF) is a powerful technique in compute...
- An Accelerated Doubly Stochastic Gradient Method with Faster Explicit Model Identification (08/11/2022). Sparsity regularized loss minimization problems play an important role i...
- Screening for a Reweighted Penalized Conditional Gradient Method (07/02/2021). The conditional gradient method (CGM) is widely used in large-scale spar...
- Accelerating Non-Negative and Bounded-Variable Linear Regression Algorithms with Safe Screening (02/15/2022). Non-negative and bounded-variable linear regression problems arise in a ...
