Local Convergence of an AMP Variant to the LASSO Solution in Finite Dimensions

07/18/2020
by Yanting Ma, et al.

A common formulation of sparse linear regression is l1-regularized least squares, also known as the least absolute shrinkage and selection operator (LASSO). Approximate message passing (AMP) has been proven to achieve the LASSO solution asymptotically when the regression matrix has independent and identically distributed (i.i.d.) Gaussian entries, in the sense that the average per-coordinate l2 distance between the AMP iterates and the LASSO solution vanishes as the signal dimension goes to infinity before the iteration number does. In finite-dimensional settings, however, the behavior of the AMP iterates in the limit of large iteration number has not been characterized. In this work, we propose an AMP variant that includes a parameter depending on the largest singular value of the regression matrix. The proposed algorithm can also be viewed as a primal-dual hybrid gradient algorithm with adaptive stepsizes. We show that whenever the AMP variant converges, it converges to the LASSO solution for arbitrary finite-dimensional regression matrices. Moreover, we show that the AMP variant is locally stable around the LASSO solution, provided that the LASSO solution is unique and that the regression matrix is drawn from a continuous distribution. Our local stability result implies that, in the special case where the regression matrix is large with i.i.d. random entries, the original AMP, which is a special case of the proposed AMP variant, is locally stable around the LASSO solution.
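To make the setting concrete, below is a minimal sketch of the standard AMP iteration for the LASSO, with soft thresholding and the Onsager correction term. This is the baseline algorithm the abstract refers to, not the paper's proposed variant: the fixed threshold `theta` and the problem sizes are illustrative assumptions, and the variant's extra parameter tied to the largest singular value of the regression matrix is omitted.

```python
import numpy as np

def soft_threshold(u, theta):
    # Componentwise soft thresholding: the proximal operator of theta * ||.||_1.
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)

def amp_lasso(A, y, theta, iters=50):
    # Plain AMP iteration for the LASSO with a fixed threshold theta
    # (an illustrative assumption; in practice the threshold is tuned or
    # adapted per iteration, and the paper's variant adds a parameter that
    # depends on the largest singular value of A).
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        x = soft_threshold(x + A.T @ z, theta)
        # Residual with the Onsager correction term (z / n) * ||x||_0;
        # this term is what distinguishes AMP from plain iterative
        # soft thresholding.
        z = y - A @ x + (z / n) * np.count_nonzero(x)
    return x

# Toy noiseless instance with an i.i.d. Gaussian regression matrix,
# the regime in which the classical AMP guarantees are stated.
rng = np.random.default_rng(0)
n, N, k = 250, 500, 10
A = rng.standard_normal((n, N)) / np.sqrt(n)  # columns roughly unit norm
x0 = np.zeros(N)
x0[:k] = 1.0
y = A @ x0
x_hat = amp_lasso(A, y, theta=0.5)
```

In finite dimensions this iteration need not converge, which is precisely the gap the paper's variant addresses by rescaling with the largest singular value of A.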


Related research

10/09/2018
SNAP: A semismooth Newton algorithm for pathwise optimization with optimal local convergence rate and oracle properties
We propose a semismooth Newton algorithm for pathwise optimization (SNAP...

01/19/2022
A Concise Tutorial on Approximate Message Passing
High-dimensional signal recovery of standard linear regression is a key ...

09/23/2013
Asymptotic Analysis of LASSO's Solution Path with Implications for Approximate Message Passing
This paper concerns the performance of the LASSO (also known as basis pu...

07/02/2021
Asymptotic Statistical Analysis of Sparse Group LASSO via Approximate Message Passing Algorithm
Sparse Group LASSO (SGL) is a regularized model for high-dimensional lin...

11/26/2014
A note relating ridge regression and OLS p-values to preconditioned sparse penalized regression
When the design matrix has orthonormal columns, "soft thresholding" the ...

05/20/2018
The Generalized Lasso Problem and Uniqueness
We study uniqueness in the generalized lasso problem, where the penalty ...

01/31/2022
Sparse Signal Reconstruction with QUBO Formulation in l0-regularized Linear Regression
An l0-regularized linear regression for a sparse signal reconstruction i...
