Stochastic model-based minimization of weakly convex functions

03/17/2018
by Damek Davis, et al.

We consider an algorithm that successively samples and minimizes stochastic models of the objective function. We show that, under weak convexity and Lipschitz conditions, the algorithm drives the expected norm of the gradient of the Moreau envelope to zero at the rate O(k^-1/4). Our result yields new complexity guarantees for the stochastic proximal point algorithm on weakly convex problems and for the stochastic prox-linear algorithm for minimizing compositions of convex functions with smooth maps. It also recovers the recently obtained complexity estimate for the stochastic proximal subgradient method on weakly convex problems.
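The abstract describes the model-based scheme only at a high level: at each step, sample a stochastic model of the objective around the current iterate and minimize it with a quadratic proximal penalty. As a hedged illustration only (not code from the paper), the sketch below instantiates one special case the framework covers, the stochastic subgradient method, on a simple convex (hence weakly convex) robust-regression objective; the data model, step size, and iteration count are assumptions chosen for the example.

```python
# Illustrative sketch (assumptions, not the paper's code): the stochastic
# model-based step specialized to the stochastic subgradient method on the
# weakly convex (here, convex) objective f(x) = E_{(a,b)} |<a, x> - b|.
import numpy as np

rng = np.random.default_rng(0)
d, T = 10, 5000
x_star = rng.normal(size=d)      # ground-truth signal (synthetic data)
x = np.zeros(d)                  # current iterate x_k

def sample():
    """Draw one stochastic sample xi = (a, b) for robust regression."""
    a = rng.normal(size=d)
    return a, a @ x_star

for k in range(T):
    a, b = sample()
    # Linear model of f(., xi) at x:  f_x(y, xi) = |<a, x> - b| + <g, y - x>,
    # where g is a subgradient of |<a, .> - b| at x.
    g = np.sign(a @ x - b) * a
    beta = 1.0 / np.sqrt(T)      # proximal parameter / step size
    # Proximal step on the model: argmin_y f_x(y, xi) + ||y - x||^2 / (2 beta)
    x = x - beta * g

print("distance to solution:", np.linalg.norm(x - x_star))
```

With a linear model, the proximal step argmin_y { f_x(y, xi) + ||y - x||^2 / (2 beta) } has the closed form x - beta*g, so the loop needs no inner solver; the stochastic proximal point and prox-linear variants mentioned in the abstract would instead solve a small strongly convex subproblem at each iteration.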

Related research

02/08/2018 · Stochastic subgradient method converges at the rate O(k^-1/4) on weakly convex functions
We prove that the projected stochastic subgradient method, applied to a ...

04/18/2021 · An extension of the proximal point algorithm beyond convexity
We introduce and investigate a new generalized convexity notion for func...

07/01/2018 · Stochastic model-based minimization under high-order growth
Given a nonsmooth, nonconvex minimization problem, we consider algorithm...

04/20/2018 · Stochastic subgradient method converges on tame functions
This work considers the question: what convergence guarantees does the s...

06/24/2015 · Un-regularizing: approximate proximal point and faster stochastic algorithms for empirical risk minimization
We develop a family of accelerated stochastic algorithms that minimize s...

05/13/2014 · Scalable sparse covariance estimation via self-concordance
We consider the class of convex minimization problems, composed of a sel...

02/04/2021 · Escaping Saddle Points for Nonsmooth Weakly Convex Functions via Perturbed Proximal Algorithms
We propose perturbed proximal algorithms that can provably escape strict...