Improved Pathwise Coordinate Descent for Power Penalties

03/04/2022
by Maryclare Griffin, et al.

Pathwise coordinate descent algorithms have been used with great success to quickly compute entire solution paths for the lasso and other penalized regression problems. They improve upon cold-start algorithms by solving the problems that make up the solution path sequentially over an ordered set of tuning parameter values, rather than solving each problem separately. However, extending pathwise coordinate descent algorithms to the more general bridge or power family of ℓ_q penalties is challenging. Faster algorithms for computing solution paths under these penalties are needed because ℓ_q penalized regression problems can be nonconvex and especially burdensome to solve. In this paper, we show that a reparameterization of ℓ_q penalized regression problems is more amenable to pathwise coordinate descent algorithms. This allows us to improve computation of the mode-thresholding function for ℓ_q penalized regression problems in practice and to introduce two distinct pathwise algorithms. We show that either pathwise algorithm is faster than the corresponding cold-start alternative, and we demonstrate that the different pathwise algorithms may be more likely to reach better solutions.
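To make the warm-start idea behind pathwise algorithms concrete, here is a minimal sketch of pathwise coordinate descent for the ordinary lasso (the convex q = 1 case), where the coordinate update reduces to soft-thresholding. This is illustrative only and is not the authors' ℓ_q algorithm; the function names, the fixed iteration count, and the log-spaced tuning-parameter grid are all assumptions made for the example. The key point is that `b` is carried over from one tuning parameter value to the next instead of being reset to zero.

```python
import numpy as np

def soft_threshold(z, gamma):
    """Soft-thresholding operator: the coordinate-wise lasso update."""
    return np.sign(z) * np.maximum(np.abs(z) - gamma, 0.0)

def lasso_path_cd(X, y, lambdas, n_iter=200):
    """Pathwise coordinate descent for the lasso.

    Approximately solves min_b (1/2n)||y - Xb||^2 + lam*||b||_1 for each
    lam in `lambdas` (assumed sorted from largest to smallest), warm-starting
    each problem at the solution for the previous lam.
    """
    n, p = X.shape
    col_sq = (X ** 2).sum(axis=0) / n
    b = np.zeros(p)  # warm start carried across the entire path
    path = []
    for lam in lambdas:
        for _ in range(n_iter):
            for j in range(p):
                # Partial residual excluding coordinate j.
                r_j = y - X @ b + X[:, j] * b[j]
                z = X[:, j] @ r_j / n
                b[j] = soft_threshold(z, lam) / col_sq[j]
        path.append(b.copy())  # one solution per tuning parameter value
    return path
```

Starting the grid at lam_max = max_j |x_j^T y| / n (where the solution is exactly zero) and decreasing toward zero means each problem's warm start is typically close to its solution, which is the source of the speedup over solving each problem from a cold start. For nonconvex ℓ_q penalties with q < 1, the soft-thresholding step would be replaced by the mode-thresholding function discussed in the abstract, and the warm start also influences which local solution the algorithm reaches.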

