A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem

03/14/2012
by Lin Xiao, et al.

We consider solving the ℓ_1-regularized least-squares (ℓ_1-LS) problem in the context of sparse recovery, for applications such as compressed sensing. The standard proximal gradient method, also known as iterative soft-thresholding when applied to this problem, has low computational cost per iteration but a rather slow convergence rate. Nevertheless, when the solution is sparse, it often exhibits fast linear convergence in the final stage. We exploit this local linear convergence using a homotopy continuation strategy: we solve the ℓ_1-LS problem for a sequence of decreasing values of the regularization parameter, and use the approximate solution at the end of each stage to warm start the next stage. Although similar strategies have been studied in the literature, there has been no theoretical analysis of their global iteration complexity. This paper shows that, under suitable assumptions for sparse recovery, the proposed homotopy strategy ensures that all iterates along the homotopy solution path are sparse. Therefore, the objective function is effectively strongly convex along the solution path, and geometric convergence at each stage can be established. As a result, the overall iteration complexity of our method is O(log(1/ϵ)) for finding an ϵ-optimal solution, which can be interpreted as a global geometric rate of convergence. We also present empirical results to support our theoretical analysis.
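To make the continuation strategy concrete, below is a minimal NumPy sketch of a proximal-gradient homotopy scheme of the kind described in the abstract. All names, the decrease factor eta, the stage tolerance delta, and the stopping test based on the size of the proximal-gradient step are illustrative choices for this sketch, not the paper's exact parameter settings or stopping criterion.

```python
import numpy as np

def soft_threshold(v, t):
    """Entrywise soft-thresholding: the proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def prox_grad_stage(A, b, x, lam, L, tol, max_iter=10000):
    """Proximal gradient (ISTA) for 0.5*||Ax - b||^2 + lam*||x||_1,
    stopped when the proximal-gradient step is smaller than tol
    (a simplified proxy for the paper's optimality-residual test)."""
    for _ in range(max_iter):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - grad / L, lam / L)
        if np.linalg.norm(x_new - x) <= tol:
            return x_new
        x = x_new
    return x

def homotopy_l1_ls(A, b, lam_tgt, eta=0.7, delta=0.2, eps=1e-8):
    """Homotopy continuation: solve a sequence of l1-LS problems with
    geometrically decreasing regularization, warm-starting each stage."""
    L = np.linalg.norm(A, 2) ** 2       # Lipschitz constant of the smooth part
    lam = np.max(np.abs(A.T @ b))       # for lam >= ||A^T b||_inf the solution is 0
    x = np.zeros(A.shape[1])
    while lam > lam_tgt:
        lam = max(eta * lam, lam_tgt)
        # loose, lambda-proportional accuracy is enough for intermediate stages
        x = prox_grad_stage(A, b, x, lam, L, tol=delta * lam)
    # final stage: solve to the target accuracy
    return prox_grad_stage(A, b, x, lam_tgt, L, tol=eps)
```

The design choice the abstract emphasizes is that warm-starting each stage from the previous (approximate) solution, with only a loose tolerance proportional to the current regularization parameter, keeps every iterate sparse; on that sparse path the objective is effectively strongly convex, each stage converges geometrically, and only the final stage needs to be solved to the target accuracy ϵ, yielding the overall O(log(1/ϵ)) complexity.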


