SNAP: A semismooth Newton algorithm for pathwise optimization with optimal local convergence rate and oracle properties

10/09/2018
by   Jian Huang, et al.

We propose a semismooth Newton algorithm for pathwise optimization (SNAP) for the LASSO and Enet in sparse, high-dimensional linear regression. SNAP is derived from a suitable formulation of the KKT conditions based on Newton derivatives. It solves the semismooth KKT equations efficiently by actively and continuously seeking the support of the regression coefficients along the solution path with warm starts. At each knot in the path, SNAP converges locally superlinearly for the Enet criterion and achieves an optimal local convergence rate for the LASSO criterion; that is, SNAP converges in one step at the cost of two matrix-vector multiplications per iteration. Under certain regularity conditions on the design matrix and on the minimum magnitude of the nonzero elements of the target regression coefficients, we show that SNAP hits a solution with the same signs as the regression coefficients and achieves a sharp estimation error bound in finitely many steps with high probability. The per-iteration computational complexity of SNAP is shown to be the same as that of the LARS and coordinate descent algorithms. Simulation studies and real data analysis support our theoretical results and demonstrate that SNAP is faster and more accurate than the LARS and coordinate descent algorithms.
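To illustrate the kind of iteration the abstract describes, below is a minimal sketch of a semismooth Newton (primal-dual active set) step for the LASSO at a single penalty value. It is written under simplifying assumptions — exact solves on the active set, one knot only, no warm-start continuation along the path — and the function name `ssn_lasso` is hypothetical; this is not the authors' SNAP implementation.

```python
import numpy as np

def ssn_lasso(X, y, lam, max_iter=50):
    """Semismooth Newton / primal-dual active set sketch for
        min_b 0.5 * ||X b - y||^2 + lam * ||b||_1.
    The KKT conditions are equivalent to the fixed point
        b = soft_threshold(b + d, lam),  with  d = X^T (y - X b),
    so the active set is estimated from |b + d| > lam and a Newton
    step solves the reduced normal equations on that set."""
    n, p = X.shape
    b = np.zeros(p)           # primal variable
    d = X.T @ (y - X @ b)     # dual (negative gradient) variable
    for _ in range(max_iter):
        A = np.abs(b + d) > lam          # active set from the KKT fixed point
        s = np.sign(b + d)[A]            # signs on the active set
        b_new = np.zeros(p)
        XA = X[:, A]
        # Newton step: solve X_A^T X_A b_A = X_A^T y - lam * s on A
        b_new[A] = np.linalg.solve(XA.T @ XA, XA.T @ y - lam * s)
        d_new = X.T @ (y - X @ b_new)
        # Converged when the active set is self-consistent
        if np.array_equal(A, np.abs(b_new + d_new) > lam):
            return b_new
        b, d = b_new, d_new
    return b
```

The dominant cost per iteration is the two matrix-vector products with `X` and `X.T`, matching the per-iteration complexity discussed in the abstract; the reduced solve acts only on the (small) active set.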



