Accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient

06/02/2022
by Zhaosong Lu et al.

In this paper we develop accelerated first-order methods for convex optimization with locally Lipschitz continuous gradient (LLCG), which goes beyond the well-studied class of convex optimization with Lipschitz continuous gradient. In particular, we first consider unconstrained convex optimization with LLCG and propose accelerated proximal gradient (APG) methods for solving it. The proposed APG methods are equipped with a verifiable termination criterion and enjoy an operation complexity of O(ε^{-1/2} log ε^{-1}) and O(log ε^{-1}) for finding an ε-residual solution of an unconstrained convex and strongly convex optimization problem, respectively. We then consider constrained convex optimization with LLCG and propose a first-order proximal augmented Lagrangian method for solving it, which applies one of our proposed APG methods to approximately solve a sequence of proximal augmented Lagrangian subproblems. The resulting method is equipped with a verifiable termination criterion and enjoys an operation complexity of O(ε^{-1} log ε^{-1}) and O(ε^{-1/2} log ε^{-1}) for finding an ε-KKT solution of a constrained convex and strongly convex optimization problem, respectively. All the proposed methods are parameter-free or almost parameter-free, except that knowledge of the convexity parameter is required. To the best of our knowledge, no prior work has studied accelerated first-order methods with complexity guarantees for convex optimization with LLCG. All the complexity results obtained in this paper are entirely new.
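To make the setting concrete, below is a minimal, hedged sketch in Python of an accelerated proximal gradient method with backtracking line search for the composite problem min f(x) + h(x), where f is convex with a locally Lipschitz gradient and h admits an easy proximal operator. Backtracking is one standard way to cope with the absence of a global Lipschitz constant; the paper's actual APG methods, their termination criterion, and their complexity analysis may differ in the details. The function names, the residual-based stopping test, and all parameters here are illustrative assumptions rather than the authors' scheme.

```python
# Sketch only: FISTA-style accelerated proximal gradient with backtracking.
# The backtracking step adapts a local Lipschitz estimate L, since no global
# Lipschitz constant is assumed for grad f.
import numpy as np

def apg_backtracking(f, grad_f, prox_h, x0, tol=1e-6, max_iter=10000,
                     L0=1.0, eta=2.0):
    """f, grad_f : smooth convex part and its gradient
    prox_h      : prox_h(v, t) = argmin_x h(x) + ||x - v||^2 / (2 t)
    x0          : starting point
    tol         : target size of the proximal-gradient residual
    L0, eta     : initial local Lipschitz estimate and backtracking factor
    """
    x = x0.copy()
    y = x0.copy()
    t = 1.0
    L = L0
    for _ in range(max_iter):
        g = grad_f(y)
        # Backtracking: increase L until the local descent condition holds.
        while True:
            x_new = prox_h(y - g / L, 1.0 / L)
            d = x_new - y
            if f(x_new) <= f(y) + g @ d + 0.5 * L * (d @ d) + 1e-12:
                break
            L *= eta
        # A verifiable stopping test: norm of the proximal-gradient mapping at y.
        residual = L * np.linalg.norm(x_new - y)
        if residual <= tol:
            return x_new
        # Nesterov/FISTA momentum update.
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
        L = max(L / eta, L0)   # let the local estimate shrink again (a common variant)
    return x
```

For the constrained setting, the abstract describes a first-order proximal augmented Lagrangian method that calls an APG routine to approximately solve each subproblem. The sketch below illustrates such an outer loop for a linearly constrained instance min f(x) + h(x) subject to Ax = b; the penalty parameter, subproblem tolerances, and stopping test are generic placeholders, not the paper's specific rules.

```python
# Sketch only: proximal augmented Lagrangian outer loop using the APG routine above.
def proximal_alm(f, grad_f, prox_h, A, b, x0, tol=1e-4, rho=10.0, outer_iters=100):
    m = A.shape[0]
    x, lam = x0.copy(), np.zeros(m)
    for _ in range(outer_iters):
        x_prev = x.copy()
        # Smooth part of the proximal AL subproblem: f + multiplier term
        # + quadratic penalty + proximal term centered at the previous iterate.
        def f_al(z):
            r = A @ z - b
            return f(z) + lam @ r + 0.5 * rho * (r @ r) + 0.5 * np.dot(z - x_prev, z - x_prev)
        def grad_al(z):
            return grad_f(z) + A.T @ (lam + rho * (A @ z - b)) + (z - x_prev)
        # Approximately solve the subproblem with the APG sketch.
        x = apg_backtracking(f_al, grad_al, prox_h, x_prev, tol=max(0.1 * tol, 1e-8))
        lam = lam + rho * (A @ x - b)
        # Generic stopping test on primal feasibility and iterate change.
        if np.linalg.norm(A @ x - b) <= tol and np.linalg.norm(x - x_prev) <= tol:
            return x, lam
    return x, lam
```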

Related research

12/21/2022  Efficient First-order Methods for Convex Optimization with Strongly Convex Function Constraints
01/12/2020  The Proximal Method of Multipliers for a Class of Nonsmooth Convex Optimization
08/25/2021  A New Insight on Augmented Lagrangian Method and Its Extensions
03/27/2018  Iteration-complexity of first-order augmented Lagrangian methods for convex conic programming
06/17/2022  RECAPP: Crafting a More Efficient Catalyst for Convex Optimization
