An abstract convergence framework with application to inertial inexact forward–backward methods

02/15/2023
by Silvia Bonettini, et al.

In this paper we introduce a novel abstract descent scheme suited for the minimization of proper and lower semicontinuous functions. The proposed abstract scheme generalizes a set of properties that are crucial for the convergence of several first-order methods designed for nonsmooth nonconvex optimization problems. Such properties guarantee the convergence of the full sequence of iterates to a stationary point, if the objective function satisfies the Kurdyka-Łojasiewicz property. The abstract framework allows for the design of new algorithms. We propose two inertial-type algorithms with implementable inexactness criteria for the main iteration update step. The first algorithm, i^2Piano, exploits large steps by adjusting a local Lipschitz constant estimate. The second algorithm, iPila, overcomes the main drawback of line-search based methods by enforcing descent only on a merit function instead of the objective function. Both algorithms have the potential to escape local minimizers (or stationary points) by leveraging the inertial feature. Moreover, they are proved to enjoy the full convergence guarantees of the abstract descent scheme, which is the best we can expect in such a general nonsmooth nonconvex optimization setup using first-order methods. The efficiency of the proposed algorithms is demonstrated on two exemplary image deblurring problems, where we can appreciate the benefits of performing a line search along the descent direction inside an inertial scheme.
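To give a concrete picture of the kind of iteration discussed above, the following Python sketch implements a generic inertial forward-backward step with a backtracked local Lipschitz estimate, in the spirit of the iPiano-type updates that i^2Piano builds on. It is a minimal illustration under standard assumptions (a smooth term f plus an l1 term, whose proximal operator is soft thresholding); the function and parameter names are ours, and the specific inexactness criteria, merit function, and step-size rules of i^2Piano and iPila are given in the paper, not reproduced here.

```python
import numpy as np

def soft_threshold(v, t):
    """Prox of t*||.||_1, the model nonsmooth term used in this sketch."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inertial_forward_backward(f, grad_f, x0, lam=0.1, beta=0.4,
                              L0=1.0, max_iter=500, tol=1e-8):
    """Illustrative inertial forward-backward iteration (iPiano-style):

        x_{k+1} = prox_{alpha_k*lam*||.||_1}( x_k - alpha_k*grad_f(x_k)
                                              + beta*(x_k - x_{k-1}) ),

    with a backtracked local Lipschitz estimate L_k and the classical
    step-size restriction alpha_k < 2*(1 - beta)/L_k.  This is NOT the
    i^2Piano/iPila update of the paper, only a sketch of the same family.
    """
    x_prev = x0.copy()
    x = x0.copy()
    L = L0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            alpha = (1.0 - beta) / L          # satisfies alpha < 2*(1-beta)/L
            x_new = soft_threshold(x - alpha * g + beta * (x - x_prev),
                                   alpha * lam)
            d = x_new - x
            # descent-lemma check: accept L once the quadratic bound holds
            if f(x_new) <= f(x) + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= 2.0                          # local estimate was too small
        if np.linalg.norm(d) <= tol * max(1.0, np.linalg.norm(x)):
            return x_new
        x_prev, x = x, x_new
        L = max(L / 2.0, L0)                  # let the estimate shrink again
    return x

# Toy usage on a sparse least-squares problem:
#   min_x 0.5*||A x - b||^2 + lam*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))
b = A @ (rng.standard_normal(100) * (rng.random(100) < 0.1))
f = lambda x: 0.5 * float(np.sum((A @ x - b) ** 2))
grad_f = lambda x: A.T @ (A @ x - b)
x_hat = inertial_forward_backward(f, grad_f, np.zeros(100), lam=0.5)
```

The backtracking loop doubles the local Lipschitz estimate until the quadratic upper bound holds, which is one simple way to "exploit large steps"; the paper's algorithms refine this idea with implementable inexactness criteria and, for iPila, a line search that enforces descent on a merit function rather than on the objective itself.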

Related research

03/09/2020
A block inertial Bregman proximal algorithm for nonsmooth nonconvex problems
In this paper, a block inertial Bregman proximal algorithm, namely [], f...

05/05/2020
Inertial Stochastic PALM and its Application for Learning Student-t Mixture Models
Inertial algorithms for minimizing nonsmooth and nonconvex functions as ...

05/10/2019
Inexact Block Coordinate Descent Algorithms for Nonsmooth Nonconvex Optimization
In this paper, we propose an inexact block coordinate descent algorithm ...

10/14/2021
Escaping Saddle Points in Nonconvex Minimax Optimization via Cubic-Regularized Gradient Descent-Ascent
The gradient descent-ascent (GDA) algorithm has been widely applied to s...

09/12/2017
A convergence frame for inexact nonconvex and nonsmooth algorithms and its applications to several iterations
In this paper, we consider the convergence of an abstract inexact noncon...

10/24/2019
A nonsmooth nonconvex descent algorithm
The paper presents a new descent algorithm for locally Lipschitz continu...

06/28/2018
Successive Convex Approximation Algorithms for Sparse Signal Estimation with Nonconvex Regularizations
In this paper, we propose a successive convex approximation framework fo...
