Revisiting Subgradient Method: Complexity and Convergence Beyond Lipschitz Continuity

05/23/2023
by Xiao Li, et al.

The subgradient method is one of the most fundamental algorithmic schemes for nonsmooth optimization. The existing complexity and convergence results for this algorithm are mainly derived for Lipschitz continuous objective functions. In this work, we first extend the typical complexity results for the subgradient method to convex and weakly convex minimization without assuming Lipschitz continuity. Specifically, we establish an 𝒪(1/√T) bound in terms of the suboptimality gap f(x) - f^* for the convex case and an 𝒪(1/T^(1/4)) bound in terms of the gradient of the Moreau envelope function for the weakly convex case. Furthermore, we provide convergence results for non-Lipschitz convex and weakly convex objective functions under suitable diminishing step size rules. In particular, when f is convex, we show an 𝒪(log(k)/√k) rate of convergence in terms of the suboptimality gap. With an additional quadratic growth condition, the rate improves to 𝒪(1/k) in terms of the squared distance to the optimal solution set. When f is weakly convex, asymptotic convergence is derived. The central idea is that the dynamics induced by a properly chosen step size rule fully controls the movement of the subgradient method, which yields boundedness of the iterates; a trajectory-based analysis can then be conducted to establish the desired results. To further illustrate the wide applicability of our framework, we extend the complexity results to the truncated subgradient, the stochastic subgradient, the incremental subgradient, and the proximal subgradient methods for non-Lipschitz functions.
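For readers unfamiliar with the basic scheme, the following is a minimal illustrative sketch of a subgradient method with a diminishing step size rule of the form η_k ∝ 1/√k. The function name `subgradient_method`, the subgradient oracle `subgrad`, the scale parameter `R`, and the use of an averaged iterate are illustrative assumptions; this is not the exact algorithm or step size rule analyzed in the paper.

```python
import numpy as np

def subgradient_method(subgrad, x0, T, R=1.0):
    """Subgradient method with a diminishing step size (illustrative sketch).

    subgrad : callable returning a subgradient of f at x
    x0      : initial point (NumPy array)
    T       : number of iterations
    R       : scale factor for the step size (assumed, for illustration only)
    """
    x = np.asarray(x0, dtype=float)
    avg = x.copy()
    for k in range(1, T + 1):
        g = subgrad(x)                 # query a subgradient at the current iterate
        eta = R / np.sqrt(k)           # diminishing rule eta_k ~ 1/sqrt(k)
        x = x - eta * g                # subgradient step
        avg += (x - avg) / (k + 1)     # running average of the iterates
    return x, avg

# Example: f(x) = ||x||_1 is convex and nonsmooth; np.sign(x) is a valid subgradient.
x_last, x_avg = subgradient_method(lambda x: np.sign(x), np.ones(5), T=1000)
```

In this sketch the averaged iterate is the one for which 𝒪(1/√T)-type suboptimality bounds are typically stated in the convex setting; the paper's contribution is establishing such guarantees without assuming Lipschitz continuity of f.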

