Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Reconstruction with a Convex Constraint

02/09/2015
by Renliang Gu, et al.

We develop a projected Nesterov's proximal-gradient (PNPG) approach for sparse signal reconstruction that combines an adaptive step size with Nesterov's momentum acceleration. The objective function that we wish to minimize is the sum of a convex differentiable data-fidelity (negative log-likelihood (NLL)) term and a convex regularization term. We apply sparse signal regularization where the signal belongs to a closed convex set within the closure of the domain of the NLL; the convex-set constraint facilitates flexible NLL domains and accurate signal recovery. Signal sparsity is imposed using an ℓ_1-norm penalty on the signal's linear transform coefficients or on its gradient map. The PNPG approach employs a projected Nesterov acceleration step with restart and an inner iteration to compute the proximal mapping. We propose an adaptive step-size selection scheme to obtain a good local majorizing function of the NLL and reduce the time spent backtracking. Thanks to step-size adaptation, PNPG does not require Lipschitz continuity of the gradient of the NLL. We present an integrated derivation of the momentum acceleration and proofs of its O(k^-2) convergence rate and iterate convergence, which account for adaptive step-size selection, inexactness of the iterative proximal mapping, and the convex-set constraint. The tuning of PNPG is largely application-independent. Tomographic and compressed-sensing reconstruction experiments with Poisson generalized linear and Gaussian linear measurement models demonstrate the performance of the proposed approach.
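To make the iteration concrete, here is a minimal NumPy sketch of the PNPG recipe for the special case of the Gaussian linear model, f(x) = 0.5 ||Ax - y||^2, with an ℓ_1 penalty and the nonnegativity constraint C = {x >= 0}. It is an illustration under stated assumptions, not the paper's implementation: the names pnpg, step_grow, step_shrink, and u are our own; for this simple C the proximal mapping has a closed form, so the paper's inner proximal iteration and its precise step-size and restart rules are replaced by textbook backtracking and an objective-based restart.

```python
# Illustrative PNPG-style sketch (not the authors' code).
# Model: f(x) = 0.5 * ||A x - y||^2, penalty u * ||x||_1, C = {x >= 0}.
import numpy as np

def pnpg(A, y, u, x0, n_iter=500, step_grow=2.0, step_shrink=0.5, tol=1e-9):
    """Projected Nesterov proximal-gradient sketch with adaptive step size,
    backtracking, and objective-based momentum restart."""
    f = lambda x: 0.5 * np.sum((A @ x - y) ** 2)      # NLL (data fidelity)
    grad = lambda x: A.T @ (A @ x - y)                # its gradient
    # For C = {x >= 0}, prox of t*u*||.||_1 + indicator(C) is closed form;
    # the paper handles general regularizers with an inner iterative prox.
    prox = lambda z, t: np.maximum(z - t * u, 0.0)

    x_prev, x = x0.copy(), x0.copy()
    theta_prev = 1.0
    beta = 1.0                                        # no Lipschitz constant needed
    obj = f(x) + u * np.sum(np.abs(x))

    for _ in range(n_iter):
        theta = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * theta_prev ** 2))
        # Projected momentum: extrapolation kept inside C by clipping
        x_bar = np.maximum(x + (theta_prev - 1.0) / theta * (x - x_prev), 0.0)
        g, f_bar = grad(x_bar), f(x_bar)
        beta *= step_grow                             # adaptive: try a larger step
        while True:
            x_new = prox(x_bar - beta * g, beta)
            d = x_new - x_bar
            # Backtrack until 1/beta yields a local majorizer of f at x_bar
            if f(x_new) <= f_bar + g @ d + d @ d / (2.0 * beta):
                break
            beta *= step_shrink
        obj_new = f(x_new) + u * np.sum(np.abs(x_new))
        if obj_new > obj:                             # restart the momentum
            theta_prev = 1.0
            continue
        if obj - obj_new <= tol * max(obj, 1.0):      # converged
            x = x_new
            break
        x_prev, x, theta_prev, obj = x, x_new, theta, obj_new
    return x

# Usage: sparse nonnegative recovery from compressive Gaussian measurements
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 400)) / 10.0
x_true = np.zeros(400)
x_true[rng.choice(400, 15, replace=False)] = rng.random(15)
y = A @ x_true + 0.001 * rng.standard_normal(100)
x_hat = pnpg(A, y, u=1e-3, x0=np.zeros(400))
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

After a restart the extrapolation term vanishes, so the next update is a plain projected proximal-gradient step from the current iterate, which the backtracking condition guarantees will not increase the objective; this is what makes the objective-based restart safe in the sketch.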
