Stochastic subgradient method converges at the rate O(k^{-1/4}) on weakly convex functions

02/08/2018
by Damek Davis, et al.

We prove that the projected stochastic subgradient method, applied to a weakly convex problem, drives the gradient of the Moreau envelope to zero at the rate O(k^{-1/4}).
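For intuition, below is a minimal Python sketch of the projected stochastic subgradient iteration x_{k+1} = proj_X(x_k - alpha_k * g_k), with g_k a stochastic subgradient, run on a toy weakly convex instance (phase retrieval with an absolute-value loss). The problem instance, ball radius, step size, and iteration count are illustrative assumptions, not taken from the paper; the theorem itself measures progress through the gradient of the Moreau envelope (near-stationarity), not recovery of the signal.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10          # dimension (illustrative)
radius = 10.0   # radius of the ball constraint X (illustrative)

# Ground-truth signal for a toy phase-retrieval instance; the loss
# |(a^T x)^2 - b| is weakly convex but nonsmooth and nonconvex.
x_true = rng.normal(size=d)
x_true /= np.linalg.norm(x_true)

def sample_subgradient(x):
    """Stochastic subgradient of |(a^T x)^2 - b| at x for a fresh sample (a, b)."""
    a = rng.normal(size=d)
    b = (a @ x_true) ** 2
    residual = (a @ x) ** 2 - b
    return np.sign(residual) * 2.0 * (a @ x) * a

def project(x, r=radius):
    """Euclidean projection onto the closed ball {x : ||x|| <= r}."""
    nrm = np.linalg.norm(x)
    return x if nrm <= r else (r / nrm) * x

K = 20_000
alpha = 1.0 / np.sqrt(K)   # constant step ~ 1/sqrt(K), consistent with O(k^{-1/4})-type analyses
x = project(rng.normal(size=d))

for k in range(K):
    x = project(x - alpha * sample_subgradient(x))

# Phase retrieval has a sign ambiguity, so compare against {x_true, -x_true}.
dist = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true))
print(f"distance to solution set after {K} steps: {dist:.4f}")
```

The constant step size proportional to 1/sqrt(K) is one common choice in analyses of this kind; the guarantee in the paper is about driving the Moreau-envelope gradient to zero on average, so the printed recovery error is only a rough proxy for progress on this particular toy instance.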


Related research

03/17/2018 · Stochastic model-based minimization of weakly convex functions
We consider an algorithm that successively samples and minimizes stochas...

04/28/2020 · Distributed Projected Subgradient Method for Weakly Convex Optimization
The stochastic subgradient method is a widely-used algorithm for solving...

01/30/2023 · Delayed Stochastic Algorithms for Distributed Weakly Convex Optimization
This paper studies delayed stochastic algorithms for weakly convex optim...

06/11/2020 · Convergence of adaptive algorithms for weakly convex constrained optimization
We analyze the adaptive first order algorithm AMSGrad, for solving a con...

02/05/2020 · Completing Simple Valuations in K-categories
We prove that Keimel and Lawson's K-completion Kc of the simple valuatio...

02/04/2021 · Escaping Saddle Points for Nonsmooth Weakly Convex Functions via Perturbed Proximal Algorithms
We propose perturbed proximal algorithms that can provably escape strict...

01/16/2023 · Distributionally Robust Learning with Weakly Convex Losses: Convergence Rates and Finite-Sample Guarantees
We consider a distributionally robust stochastic optimization problem an...
