Estimate Sequences for Stochastic Composite Optimization: Variance Reduction, Acceleration, and Robustness to Noise

01/25/2019
by Andrei Kulunchakov, et al.

In this paper, we propose a unified view of gradient-based algorithms for stochastic convex composite optimization. By extending the concept of estimate sequences introduced by Nesterov, we interpret a large class of stochastic optimization methods as procedures that iteratively minimize a surrogate of the objective. This point of view covers stochastic gradient descent (SGD), the variance-reduction approaches SAGA, SVRG, and MISO, and their proximal variants, and it has several advantages: (i) we provide a simple generic proof of convergence for all of the aforementioned methods; (ii) we naturally obtain new algorithms with the same guarantees; (iii) we derive generic strategies to make these algorithms robust to stochastic noise, which is useful when data is corrupted by small random perturbations. Finally, we show that this viewpoint is useful for obtaining accelerated algorithms.
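
For context, here is Nesterov's classical definition that the paper extends to the stochastic setting (the notation $d_k$, $\lambda_k$, $F$ is ours, not taken from the paper). A sequence of functions $(d_k)_{k \ge 0}$ together with scalars $\lambda_k \to 0$ is an estimate sequence of the objective $F$ if, for all $x$ and all $k$,
\[
d_k(x) \;\le\; (1 - \lambda_k)\, F(x) + \lambda_k\, d_0(x).
\]
If the iterates additionally satisfy $F(x_k) \le \min_x d_k(x)$, then
\[
F(x_k) - F^\star \;\le\; \lambda_k \bigl( d_0(x^\star) - F^\star \bigr),
\]
so the rate at which $\lambda_k$ vanishes is the convergence rate of the method; the surrogates minimized at each iteration play the role of the $d_k$.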

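To make the surrogate-minimization viewpoint concrete, the following is a minimal Python sketch of SVRG for a smooth finite-sum objective F(x) = (1/n) sum_i f_i(x); the per-example gradient oracle grad_i, the step size, and the loop lengths are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def svrg(grad_i, x0, n, step, epochs=30, m=None):
        """SVRG sketch. Each update minimizes in z the quadratic surrogate
        d(z) = F(x) + <g, z - x> + (1/(2*step)) * ||z - x||^2,
        whose minimizer is the gradient step x - step * g with a
        variance-reduced gradient estimate g."""
        x = x0.copy()
        m = m if m is not None else n  # inner-loop length (illustrative default)
        for _ in range(epochs):
            x_ref = x.copy()  # snapshot point
            full_ref = np.mean([grad_i(i, x_ref) for i in range(n)], axis=0)
            for _ in range(m):
                i = np.random.randint(n)
                # Unbiased estimate whose variance shrinks as x approaches x_ref.
                g = grad_i(i, x) - grad_i(i, x_ref) + full_ref
                x = x - step * g
        return x

For instance, with per-row least-squares losses f_i(x) = (1/2)(a_i^T x - b_i)^2:

    rng = np.random.default_rng(0)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    x_hat = svrg(lambda i, x: (A[i] @ x - b[i]) * A[i],
                 x0=np.zeros(5), n=100, step=0.01)

Replacing the variance-reduced estimate g with grad_i(i, x) alone recovers plain SGD, which is one way to see how a single surrogate template can cover the family of methods listed in the abstract.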
Related research

05/07/2019 · Estimate Sequences for Variance-Reduced Stochastic Composite Optimization
In this paper, we propose a unified view of gradient-based algorithms fo...

06/03/2019 · A Generic Acceleration Framework for Stochastic Composite Optimization
In this paper, we introduce various mechanisms to obtain accelerated fir...

10/04/2016 · Stochastic Optimization with Variance Reduction for Infinite Datasets with Finite-Sum Structure
Stochastic optimization algorithms with variance reduction have proven s...

08/18/2023 · Variance reduction techniques for stochastic proximal point algorithms
In the context of finite sums minimization, variance reduction technique...

12/19/2022 · Gradient Descent-Type Methods: Background and Simple Unified Convergence Analysis
In this book chapter, we briefly describe the main components that const...

09/07/2021 · COCO Denoiser: Using Co-Coercivity for Variance Reduction in Stochastic Convex Optimization
First-order methods for stochastic optimization have undeniable relevanc...

05/31/2018 · On Acceleration with Noise-Corrupted Gradients
Accelerated algorithms have broad applications in large-scale optimizati...
