Accelerated Sparse Recovery via Gradient Descent with Nonlinear Conjugate Gradient Momentum

08/25/2022
by   Mengqi Hu, et al.

This paper applies an adaptive-momentum idea from the nonlinear conjugate gradient method to accelerate optimization problems arising in sparse recovery. Specifically, we consider two types of minimization problems: a (single) differentiable function, and the sum of a non-smooth function and a differentiable function. In the first case, we adopt a fixed step size to avoid a traditional line search and establish a convergence analysis of the proposed algorithm for a quadratic problem. In the second case, this acceleration is combined with an operator-splitting technique to handle the non-smooth function. We use the convex ℓ_1 and the nonconvex ℓ_1-ℓ_2 functionals as two case studies to demonstrate the efficiency of the proposed approaches over traditional methods.


