A Universally Optimal Multistage Accelerated Stochastic Gradient Method

01/23/2019
by   Necdet Serhat Aybat, et al.

We study the problem of minimizing a strongly convex and smooth function given only noisy estimates of its gradient. We propose a novel multistage accelerated algorithm that is universally optimal in the sense that it achieves the optimal rate in both the deterministic and stochastic cases and operates without knowledge of the noise characteristics. The algorithm consists of stages, each running a stochastic version of Nesterov's accelerated method with a specific restart and with parameters chosen to achieve the fastest reduction of the bias and variance terms in the convergence rate bounds.
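The paper's exact stage lengths and parameter schedule are not reproduced on this page; the following is a minimal Python sketch of the general stage/restart structure described above, assuming a constant-momentum form of Nesterov's method within each stage, a per-stage step-size scaling that shrinks in later stages, and a warm-started restart between stages. The names multistage_asg, stage_iters, and step_scales are illustrative, not taken from the paper.

```python
import numpy as np


def multistage_asg(grad, x0, L, mu, stage_iters, step_scales, rng=None):
    """Hypothetical sketch of a multistage accelerated stochastic gradient loop.

    grad(x, rng): returns a noisy (unbiased) estimate of the gradient at x.
    L, mu:        smoothness and strong-convexity constants of the objective.
    stage_iters:  number of inner iterations per stage, e.g. [50, 100, 200].
    step_scales:  per-stage step-size scalings; smaller values in later
                  stages damp the variance contribution of the noise.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float).copy()
    kappa = L / mu
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)  # momentum weight
    for n_k, scale in zip(stage_iters, step_scales):
        # Restart: each stage begins Nesterov's scheme afresh from the
        # final iterate of the previous stage.
        y = x.copy()
        x_prev = x.copy()
        alpha = scale / L  # stage-specific step size
        for _ in range(n_k):
            g = grad(y, rng)                     # noisy gradient at the extrapolated point
            x_new = y - alpha * g                # gradient step
            y = x_new + beta * (x_new - x_prev)  # momentum extrapolation
            x_prev = x_new
        x = x_prev  # warm-start the next stage
    return x


# Example usage: a noisy quadratic f(x) = 0.5 * x'Ax with additive Gaussian noise.
if __name__ == "__main__":
    A = np.diag([1.0, 10.0])  # mu = 1, L = 10
    noisy_grad = lambda x, rng: A @ x + 0.1 * rng.standard_normal(2)
    x_out = multistage_asg(noisy_grad, np.array([5.0, -3.0]), L=10.0, mu=1.0,
                           stage_iters=[50, 100, 200],
                           step_scales=[1.0, 0.25, 0.0625])
    print(x_out)
```

The intent of the schedule in this sketch mirrors the bias-variance trade-off mentioned in the abstract: early stages with larger step sizes drive down the deterministic (bias) term quickly, while later stages with smaller step sizes suppress the accumulated gradient-noise (variance) term.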


Related research

06/09/2015 | Accelerated Stochastic Gradient Descent for Minimizing Finite Sums
We propose an optimization method for minimizing the finite sums of smoo...

02/13/2020 | An Optimal Multistage Stochastic Gradient Method for Minimax Problems
In this paper, we study the minimax optimization problem in the smooth a...

07/04/2023 | Accelerated stochastic approximation with state-dependent noise
We consider a class of stochastic smooth convex optimization problems un...

06/17/2022 | Optimal Extragradient-Based Bilinearly-Coupled Saddle-Point Optimization
We consider the smooth convex-concave bilinearly-coupled saddle-point pr...

05/31/2018 | On Acceleration with Noise-Corrupted Gradients
Accelerated algorithms have broad applications in large-scale optimizati...

07/19/2017 | Acceleration and Averaging in Stochastic Mirror Descent Dynamics
We formulate and study a general family of (continuous-time) stochastic ...

03/03/2022 | Accelerated SGD for Non-Strongly-Convex Least Squares
We consider stochastic approximation for the least squares regression pr...
