Proximal Subgradient Norm Minimization of ISTA and FISTA

11/03/2022
by Bowen Li, et al.

For first-order smooth optimization, research on the acceleration phenomenon has a long history. Only recently was the mechanism leading to acceleration successfully uncovered, via the gradient-correction term and its equivalent implicit-velocity form. Furthermore, based on the high-resolution differential equation framework together with its emerging techniques, phase-space representation and Lyapunov functions, the squared gradient norm of Nesterov's accelerated gradient descent (NAG) method was shown to converge at an inverse cubic rate. However, this result cannot be directly generalized to composite optimization, which is widely used in practice, e.g., the linear inverse problem with sparse representation. In this paper, we carefully examine a pivotal inequality used in composite optimization involving the step size s and the Lipschitz constant L and find that it can be tightened. We apply this tighter inequality to the well-constructed Lyapunov function and then obtain the proximal subgradient norm minimization via the phase-space representation, in both the gradient-correction and implicit-velocity forms. Furthermore, we demonstrate that the squared proximal subgradient norm for the class of iterative shrinkage-thresholding algorithms (ISTA) converges at an inverse square rate, and that the squared proximal subgradient norm for the class of fast iterative shrinkage-thresholding algorithms (FISTA) is accelerated to an inverse cubic rate of convergence.
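To make the objects in the abstract concrete, the following is a minimal sketch (not the paper's implementation) of ISTA and FISTA for the lasso problem min 0.5||Ax-b||^2 + lam*||x||_1, tracking the norm of the proximal (sub)gradient mapping G(y) = (y - prox(y - s*grad(y)))/s whose decay rates the paper studies. The function names `soft_threshold`, `ista`, and `fista` and all parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def ista(A, b, lam, s, iters=200):
    # Plain proximal gradient (ISTA) with fixed step size s.
    x = np.zeros(A.shape[1])
    prox_grad_norms = []
    for _ in range(iters):
        grad = A.T @ (A @ x - b)
        x_new = soft_threshold(x - s * grad, s * lam)
        # Proximal gradient mapping G(x) = (x - x_new)/s; the paper shows
        # its squared norm decays at an inverse square rate for ISTA.
        prox_grad_norms.append(np.linalg.norm((x - x_new) / s))
        x = x_new
    return x, prox_grad_norms

def fista(A, b, lam, s, iters=200):
    # FISTA adds Nesterov momentum; the squared proximal subgradient
    # norm is accelerated to an inverse cubic rate.
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    prox_grad_norms = []
    for _ in range(iters):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - s * grad, s * lam)
        prox_grad_norms.append(np.linalg.norm((y - x_new) / s))
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x, prox_grad_norms
```

A usual choice, consistent with the inequality about s and L discussed in the abstract, is s = 1/L with L the largest eigenvalue of A^T A (the Lipschitz constant of the smooth part's gradient).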

