Inexact SARAH Algorithm for Stochastic Optimization

11/25/2018
by Lam M. Nguyen et al.

We develop and analyze a variant of the variance-reducing stochastic gradient algorithm SARAH that does not require computation of the exact gradient. The new method can therefore be applied to general expectation minimization problems rather than only to finite-sum problems. While the original SARAH algorithm, as well as its predecessor SVRG, requires an exact gradient computation at each outer iteration, the inexact variant of SARAH (iSARAH), which we develop here, requires only a stochastic gradient computed on a mini-batch of sufficient size. The proposed method combines variance reduction via sample-size selection with iterative stochastic gradient updates. We analyze the convergence rate of the algorithm in the strongly convex, convex, and nonconvex cases, with an appropriate mini-batch size selected for each case. We show that, under one additional and reasonable assumption, iSARAH achieves the best known complexity among stochastic methods for general convex stochastic objective functions.
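
To make the outer-loop modification concrete, below is a minimal NumPy sketch of the iSARAH update structure on a synthetic least-squares expectation-minimization problem. The objective, step size `eta`, inner-loop length, and outer mini-batch size here are illustrative choices, not the tuned values from the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def stoch_grad(w, X, y):
    """Gradient of the average squared error 0.5*||X w - y||^2 / n."""
    return X.T @ (X @ w - y) / len(y)

def sample(batch_size, d=10):
    """Draw a fresh mini-batch from the underlying distribution
    (here: a synthetic linear model with Gaussian noise)."""
    X = rng.standard_normal((batch_size, d))
    y = X @ np.ones(d) + 0.1 * rng.standard_normal(batch_size)
    return X, y

def isarah(w0, eta=0.05, outer_iters=20, inner_iters=50, big_batch=1024):
    """Sketch of iSARAH: the exact outer gradient of SARAH is replaced
    by a large-mini-batch estimate, so no finite sum is ever needed."""
    w_prev = w0.copy()
    for _ in range(outer_iters):
        # Inexact outer step: mini-batch gradient instead of exact gradient.
        Xb, yb = sample(big_batch)
        v = stoch_grad(w_prev, Xb, yb)
        w = w_prev - eta * v
        for _ in range(inner_iters):
            # SARAH recursive gradient estimate from a single fresh sample.
            Xi, yi = sample(1)
            v = stoch_grad(w, Xi, yi) - stoch_grad(w_prev, Xi, yi) + v
            w_prev, w = w, w - eta * v
        # The next outer loop restarts from the last inner iterate
        # (a randomly chosen inner iterate is another common choice).
    return w

w = isarah(np.zeros(10))
print(np.linalg.norm(w - np.ones(10)))  # distance to the true minimizer
```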

Related research:

- Doubly Accelerated Stochastic Variance Reduced Dual Averaging Method for Regularized Empirical Risk Minimization (03/01/2017)
- Katyusha Acceleration for Convex Finite-Sum Compositional Optimization (10/24/2019)
- AdaBatch: Efficient Gradient Aggregation Rules for Sequential and Parallel Stochastic Gradient Methods (11/06/2017)
- Adaptive Sampling Strategies for Stochastic Optimization (10/30/2017)
- Less than a Single Pass: Stochastically Controlled Stochastic Gradient Method (09/12/2016)
- Stochastically Controlled Stochastic Gradient for the Convex and Non-convex Composition problem (09/06/2018)
- Plug-In Stochastic Gradient Method (11/08/2018)
