
Accelerating Non-Negative and Bounded-Variable Linear Regression Algorithms with Safe Screening

by Cassio Dantas, et al.

Non-negative and bounded-variable linear regression problems arise in a variety of applications in machine learning and signal processing. In this paper, we propose a technique to accelerate existing solvers for these problems by identifying saturated coordinates in the course of iterations. This is akin to safe screening techniques previously proposed for sparsity-regularized regression problems. The proposed strategy is provably safe as it provides theoretical guarantees that the identified coordinates are indeed saturated in the optimal solution. Experimental results on synthetic and real data show compelling accelerations for both non-negative and bounded-variable problems.
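The idea of screening saturated coordinates can be illustrated on non-negative least squares (NNLS). The sketch below is not the authors' code: it is a minimal, hypothetical Gap Safe-style test, assuming a primal iterate `x >= 0` and a dual-feasible point `theta` (i.e. `A.T @ theta <= 0`). Since the dual objective is 1-strongly concave, the optimal dual point lies within a ball of radius `sqrt(2 * gap)` around `theta`; if `a_j @ theta` stays strictly negative over that whole ball, complementary slackness forces `x_j = 0` at the optimum, so coordinate `j` can safely be fixed.

```python
import numpy as np

def gap_safe_screen_nnls(A, b, x, theta):
    """Gap Safe-style screening test for min_{x >= 0} 0.5*||Ax - b||^2.

    x     : current primal iterate (x >= 0)
    theta : dual-feasible point, i.e. A.T @ theta <= 0

    The dual objective D(theta) = b @ theta - 0.5*||theta||^2 is 1-strongly
    concave, so the optimal dual point lies in a ball of radius
    sqrt(2 * gap) around theta. If a_j @ theta + radius*||a_j|| < 0, then
    a_j @ theta* < 0 at the optimum, and complementary slackness implies
    x_j* = 0: coordinate j is provably saturated at its bound.
    """
    residual = b - A @ x
    primal = 0.5 * residual @ residual
    dual = b @ theta - 0.5 * theta @ theta
    gap = max(primal - dual, 0.0)
    radius = np.sqrt(2.0 * gap)
    col_norms = np.linalg.norm(A, axis=0)
    return A.T @ theta + radius * col_norms < 0  # True -> x_j* = 0

# Toy problem with orthonormal columns, chosen so that a single pass of
# half-space projections yields an exactly dual-feasible theta.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((20, 5)))
c = np.array([1.0, -0.5, 2.0, -1.0, 0.3])
b = Q @ c                      # the NNLS optimum here is max(c, 0)
x = np.maximum(c, 0.0) + 0.01  # near-optimal but unconverged iterate

residual = b - Q @ x
# Project the residual onto {theta : Q^T theta <= 0}; one pass is exact
# here because the half-space normals (columns of Q) are orthonormal.
theta = residual - Q @ np.maximum(Q.T @ residual, 0.0)

mask = gap_safe_screen_nnls(Q, b, x, theta)
print(mask)  # coordinates 1 and 3 (where x*_j = 0) are flagged as saturated
```

The test is conservative by construction: at this (still imprecise) iterate it flags exactly the two coordinates saturated at zero, and as the duality gap shrinks over the iterations the safe ball tightens, so more saturated coordinates become identifiable and can be dropped from subsequent updates.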



