Randomized Coordinate Subgradient Method for Nonsmooth Optimization

06/30/2022
by   Lei Zhao, et al.

Nonsmooth optimization finds wide applications in many engineering fields. In this work, we propose to use the Randomized Coordinate Subgradient Method (RCS) for solving both nonsmooth convex and nonsmooth nonconvex (i.e., nonsmooth weakly convex) optimization problems. At each iteration, RCS randomly selects one block of coordinates rather than all coordinates to update. Motivated by practical applications, we assume that the objective function has linearly bounded subgradients, which is much more general than the standard Lipschitz continuity assumption. Under this general assumption, we conduct a thorough convergence analysis of RCS in both the convex and nonconvex cases, establishing expected convergence rates as well as almost sure asymptotic convergence results. To derive these results, we prove a convergence lemma and a relationship between the global metric subregularity properties of a weakly convex function and of its Moreau envelope, both of which are fundamental and of independent interest. Finally, we present several experiments that demonstrate the possible superiority of RCS over the full subgradient method.
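To make the block update concrete, the following is a minimal sketch of the randomized coordinate subgradient step described above, assuming uniform block sampling, a diminishing step-size schedule, and a user-supplied block subgradient oracle; the names rcs and subgrad_block are illustrative, not the authors' implementation.

import numpy as np

def rcs(subgrad_block, x0, num_blocks, step_sizes, rng=None):
    """Randomized coordinate subgradient sketch (illustrative, not the paper's code).

    subgrad_block(x, J) must return a subgradient of the objective at x,
    restricted to the coordinate indices J.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.array(x0, dtype=float)
    blocks = np.array_split(np.arange(x.size), num_blocks)
    for step in step_sizes:
        J = blocks[rng.integers(num_blocks)]   # pick one block uniformly at random
        x[J] -= step * subgrad_block(x, J)     # subgradient step on that block only
    return x

# Example: least-absolute-deviations objective f(x) = ||Ax - b||_1, whose
# subgradient restricted to block J is A[:, J].T @ sign(Ax - b).
rng = np.random.default_rng(0)
A, b = rng.standard_normal((200, 50)), rng.standard_normal(200)
sg = lambda x, J: A[:, J].T @ np.sign(A @ x - b)
steps = [1.0 / np.sqrt(k + 1) for k in range(2000)]   # diminishing step sizes
x_hat = rcs(sg, np.zeros(50), num_blocks=5, step_sizes=steps)

In this sketch, each iteration touches only one block of coordinates (and, in the example, only the corresponding columns of A), which is what keeps the per-iteration cost of RCS below that of the full subgradient method.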
