A nonsmooth nonconvex descent algorithm

10/24/2019
by Jan Mankau, et al.

The paper presents a new descent algorithm for locally Lipschitz continuous functions f:X→R. The selection of a descent direction at an iteration point x combines an approximation of the set-valued gradient of f on a suitable neighborhood of x (recently introduced by Mankau and Schuricht) with an Armijo-type step control. The algorithm is analytically justified, and it is shown that accumulation points of the iterates are critical points of f. Finally, the algorithm is tested on numerous benchmark problems and the results are compared with simulations found in the literature.
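The abstract itself contains no code, so the following is only a minimal Python sketch of the general scheme it describes: approximate a set-valued gradient by sampling gradients in a neighborhood of x, take the minimum-norm element of their convex hull as the descent direction, and choose the step size by Armijo-type backtracking. The neighborhood sampling here follows the well-known gradient-sampling idea (Burke, Lewis, Overton) as a stand-in for the Mankau–Schuricht construction, which this sketch does not reproduce; all names (`min_norm_point`, `nonsmooth_descent`) and parameter defaults are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def min_norm_point(G, iters=200):
    """Frank-Wolfe iteration for the minimum-norm element of
    conv{G[0], ..., G[m-1]} (rows of G are sampled gradients).
    This element plays the role of the steepest-descent direction
    in gradient-sampling methods."""
    m, _ = G.shape
    w = np.full(m, 1.0 / m)                  # start at the barycenter of the simplex
    for _ in range(iters):
        v = G.T @ w                          # current convex combination of gradients
        i = np.argmin(G @ v)                 # vertex minimizing the linearized objective
        u = G[i] - v                         # move toward that vertex in gradient space
        uu = u @ u
        if uu < 1e-14:                       # already at the minimizer over this face
            break
        t = np.clip(-(v @ u) / uu, 0.0, 1.0) # exact line search on the segment [v, G[i]]
        w *= 1.0 - t
        w[i] += t
    return G.T @ w

def nonsmooth_descent(f, grad, x0, eps=1e-1, m=20, c=0.5, beta=0.5,
                      tol=1e-8, max_iter=500, seed=0):
    """Descent sketch for locally Lipschitz f. `grad` must return a
    gradient of f wherever f is differentiable (which holds almost
    everywhere by Rademacher's theorem)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        # Sample gradients in an eps-box around x to approximate
        # the set-valued gradient of f near x.
        pts = x + eps * rng.uniform(-1.0, 1.0, size=(m, x.size))
        G = np.vstack([grad(x)] + [grad(p) for p in pts])
        g = min_norm_point(G)
        gn = np.linalg.norm(g)
        if gn < tol:                         # approximately critical: shrink the neighborhood
            eps *= 0.5
            if eps < tol:
                return x
            continue
        d = -g / gn                          # normalized descent direction
        # Armijo-type backtracking step control.
        fx, t = f(x), 1.0
        while f(x + t * d) > fx - c * t * gn and t > 1e-16:
            t *= beta
        x = x + t * d
    return x
```

A quick usage example on a simple nonsmooth test function (again purely illustrative):

```python
# Minimize the nonsmooth f(x) = |x_0| + (x_1 - 1)^2; np.sign(0) = 0
# is a valid subgradient of |.| at the kink.
f = lambda x: abs(x[0]) + (x[1] - 1.0) ** 2
g = lambda x: np.array([np.sign(x[0]), 2.0 * (x[1] - 1.0)])
x_star = nonsmooth_descent(f, g, x0=[3.0, -2.0])   # converges near [0, 1]
```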

