Convergence of adaptive algorithms for weakly convex constrained optimization

06/11/2020
by Ahmet Alacaoglu, et al.

We analyze the adaptive first-order algorithm AMSGrad for solving a constrained stochastic optimization problem with a weakly convex objective. We prove an Õ(t^-1/4) rate of convergence for the norm of the gradient of the Moreau envelope, which is the standard stationarity measure for this class of problems. This rate matches the known rates that adaptive algorithms enjoy in the special case of unconstrained smooth stochastic optimization. Our analysis works with a mini-batch size of 1, constant first- and second-order moment parameters, and possibly unbounded optimization domains. Finally, we illustrate applications and extensions of our results to specific problems and algorithms.
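For reference, the stationarity measure named in the abstract is standard for weakly convex problems: the Moreau envelope of f (for the constrained problem, of f plus the indicator of the constraint set) with parameter λ > 0 is well defined for a ρ-weakly convex f whenever λ < 1/ρ,

```latex
f_\lambda(x) = \min_y \Big\{ f(y) + \tfrac{1}{2\lambda}\,\lVert y - x \rVert^2 \Big\},
\qquad
\nabla f_\lambda(x) = \tfrac{1}{\lambda}\big(x - \operatorname{prox}_{\lambda f}(x)\big),
```

and a small ‖∇f_λ(x)‖ certifies that x is close to a near-stationary point of f, which is why this gradient norm serves as the convergence criterion.

The algorithm analyzed is AMSGrad combined with a projection onto the constraint set. Below is a minimal sketch under the settings the abstract describes (mini-batch size 1, constant moment parameters), not the paper's code: the stochastic subgradient oracle `grad_fn`, the projection `project`, and all hyperparameter values are illustrative assumptions. A Euclidean projection is used for simplicity; for a coordinate-aligned box it coincides with the diagonally weighted projection used in the original AMSGrad analysis.

```python
import numpy as np

def amsgrad_projected(grad_fn, project, x0, steps=10_000,
                      alpha=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """Projected AMSGrad sketch: mini-batch size 1, constant beta1/beta2."""
    x = np.asarray(x0, dtype=float).copy()
    m = np.zeros_like(x)      # first-moment (momentum) estimate
    v = np.zeros_like(x)      # second-moment estimate
    v_hat = np.zeros_like(x)  # running max of v: the AMSGrad correction
    for _ in range(steps):
        g = grad_fn(x)                    # one stochastic subgradient (batch size 1)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        v_hat = np.maximum(v_hat, v)      # keeps per-coordinate step sizes non-increasing
        x = project(x - alpha * m / (np.sqrt(v_hat) + eps))
    return x

# Illustrative use: robust regression sum_i |a_i^T x - b_i| (convex, hence
# weakly convex) over the box [-1, 1]^10; projection = coordinatewise clipping.
rng = np.random.default_rng(0)
A, b = rng.normal(size=(500, 10)), rng.normal(size=500)

def grad_fn(x):
    i = rng.integers(len(b))
    return np.sign(A[i] @ x - b[i]) * A[i]  # subgradient of |a_i^T x - b_i|

x_out = amsgrad_projected(grad_fn, lambda z: np.clip(z, -1.0, 1.0), x0=np.zeros(10))
```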


Related research

Adaptive First- and Zeroth-order Methods for Weakly Convex Stochastic Optimization Problems (05/19/2020)
In this paper, we design and analyze a new family of adaptive subgradien...

Distributionally Robust Learning with Weakly Convex Losses: Convergence Rates and Finite-Sample Guarantees (01/16/2023)
We consider a distributionally robust stochastic optimization problem an...

Bayesian Joint Chance Constrained Optimization: Approximations and Statistical Consistency (06/23/2021)
This paper considers data-driven chance-constrained stochastic optimizat...

The importance of better models in stochastic optimization (03/20/2019)
Standard stochastic optimization methods are brittle, sensitive to steps...

Convergence and Complexity of Stochastic Subgradient Methods with Dependent Data for Nonconvex Optimization (03/29/2022)
We show that under a general dependent data sampling scheme, the classic...

Towards Practical Adam: Non-Convexity, Convergence Theory, and Mini-Batch Acceleration (01/14/2021)
Adam is one of the most influential adaptive stochastic algorithms for t...

Stochastic subgradient method converges at the rate O(k^-1/4) on weakly convex functions (02/08/2018)
We prove that the projected stochastic subgradient method, applied to a ...
