A Unified Convergence Theorem for Stochastic Optimization Methods

06/08/2022
by Xiao Li, et al.

In this work, we provide a fundamental unified convergence theorem for deriving expected and almost sure convergence results for a series of stochastic optimization methods. Our unified theorem only requires verifying several representative conditions and is not tailored to any specific algorithm. As a direct application, we recover expected and almost sure convergence results of the stochastic gradient method (SGD) and random reshuffling (RR) under more general settings. Moreover, we establish new expected and almost sure convergence results for the stochastic proximal gradient method (prox-SGD) and stochastic model-based methods (SMM) for nonsmooth nonconvex optimization problems. These applications reveal that our unified theorem provides a plugin-type convergence analysis and strong convergence guarantees for a wide class of stochastic optimization methods.
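For concreteness, below is a minimal Python sketch (not taken from the paper) of two of the update rules the theorem covers as applications: the plain SGD step and the prox-SGD step, run on a hypothetical l1-regularized least-squares instance. The names sgd_step, prox_sgd_step, and soft_threshold, the synthetic data, and the step-size schedule are all illustrative assumptions, not the paper's method.

import numpy as np

def sgd_step(x, stoch_grad, alpha):
    # Plain SGD update: x_{k+1} = x_k - alpha_k * g_k, where g_k is a
    # stochastic gradient sampled at x_k.
    return x - alpha * stoch_grad(x)

def prox_sgd_step(x, stoch_grad, prox, alpha):
    # Stochastic proximal gradient (prox-SGD) update for f + phi:
    # x_{k+1} = prox_{alpha_k * phi}(x_k - alpha_k * g_k).
    return prox(x - alpha * stoch_grad(x), alpha)

# Hypothetical instance: minimize (1/n) sum_i 0.5*(a_i^T x - b_i)^2 + lam*||x||_1.
rng = np.random.default_rng(0)
A, b, lam = rng.standard_normal((100, 5)), rng.standard_normal(100), 0.1

def stoch_grad(x):
    i = rng.integers(len(b))          # i.i.d. sampling; RR would instead sweep
                                      # a fresh random permutation each epoch
    return A[i] * (A[i] @ x - b[i])   # gradient of 0.5*(a_i^T x - b_i)^2

def soft_threshold(z, t):             # proximal operator of t*lam*||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t * lam, 0.0)

x = np.zeros(5)
for k in range(1, 5001):
    # Diminishing step sizes, as is standard for stochastic methods.
    x = prox_sgd_step(x, stoch_grad, soft_threshold, alpha=1.0 / (k + 10))

The unified theorem would then deliver convergence guarantees for iterations like these once its algorithm-specific conditions are verified.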

Related research

06/25/2017 · A Unified Analysis of Stochastic Optimization Methods Using Jump System Theory and Quadratic Constraints
We develop a simple routine unifying the analysis of several important r...

02/26/2018 · Shampoo: Preconditioned Stochastic Tensor Optimization
Preconditioned gradient methods are among the most general and powerful ...

07/08/2022 · Tackling Data Heterogeneity: A New Unified Framework for Decentralized SGD with Sample-induced Topology
We develop a general framework unifying several gradient-based stochasti...

09/05/2023 · PROMISE: Preconditioned Stochastic Optimization Methods by Incorporating Scalable Curvature Estimates
This paper introduces PROMISE (Preconditioned Stochastic Optimization Me...

03/29/2023 · Unified analysis of SGD-type methods
This note focuses on a simple approach to the unified analysis of SGD-ty...

07/20/2018 · signProx: One-Bit Proximal Algorithm for Nonconvex Stochastic Optimization
Stochastic gradient descent (SGD) is one of the most widely used optimiz...

10/21/2020 · Data augmentation as stochastic optimization
We present a theoretical framework recasting data augmentation as stocha...
