Complexity-Optimal and Curvature-Free First-Order Methods for Finding Stationary Points of Composite Optimization Problems

05/25/2022
by Weiwei Kong, et al.

This paper develops and analyzes an accelerated proximal descent method for finding stationary points of nonconvex composite optimization problems. The objective function is of the form f+h, where h is a proper closed convex function, f is a differentiable function on the domain of h, and ∇f is Lipschitz continuous on the domain of h. The main advantage of this method is that it is "curvature-free" in the sense that it does not require knowledge of the Lipschitz constant of ∇f or of any global topological properties of f. It is shown that the proposed method can obtain a ρ-approximate stationary point with iteration complexity bounds that are optimal, up to logarithmic factors in ρ, in both the convex and nonconvex settings. It is also discussed how the proposed method can be leveraged within other existing optimization frameworks, such as min-max smoothing and penalty frameworks for constrained programming, to create more specialized curvature-free methods. Finally, numerical experiments on a set of nonconvex quadratic semidefinite programming problems are given to support the practical viability of the method.
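To make the "curvature-free" idea concrete, the sketch below shows a generic proximal gradient loop with a backtracking line search on a composite objective f+h: instead of supplying the Lipschitz constant of ∇f, the step size adapts until a sufficient-decrease condition holds. This is a standard illustrative scheme, not the paper's accelerated method; the function names and the ℓ1 example in the usage note are hypothetical.

```python
import numpy as np

def prox_l1(x, t):
    # Proximal operator of t*||.||_1 (soft-thresholding); stands in for prox_h.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def adaptive_prox_grad(f, grad_f, prox_h, x0, lam0=1.0, tol=1e-6, max_iter=1000):
    """Proximal gradient with backtracking: no Lipschitz constant of grad f
    is supplied; the step size lam shrinks/grows adaptively instead."""
    x, lam = np.asarray(x0, dtype=float), lam0
    for _ in range(max_iter):
        g = grad_f(x)
        while True:
            x_new = prox_h(x - lam * g, lam)
            d = x_new - x
            # Sufficient-decrease test replaces knowledge of the curvature:
            # f(x_new) <= f(x) + <g, d> + ||d||^2 / (2*lam)
            if f(x_new) <= f(x) + g @ d + (0.5 / lam) * (d @ d):
                break
            lam *= 0.5  # backtrack
        # ||x_new - x|| / lam is a standard stationarity residual
        if np.linalg.norm(d) / lam < tol:
            return x_new
        x, lam = x_new, lam * 2.0  # tentatively grow the step again
    return x
```

For example, minimizing 0.5‖Ax − b‖² + μ‖x‖₁ would use `f = lambda x: 0.5*np.sum((A@x - b)**2)`, `grad_f = lambda x: A.T @ (A@x - b)`, and `prox_h = lambda x, t: prox_l1(x, t*mu)`; no estimate of ‖AᵀA‖ is ever passed in.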


