Conditional Accelerated Lazy Stochastic Gradient Descent

03/16/2017
by Guanghui Lan, et al.

In this work we introduce a conditional accelerated lazy stochastic gradient descent algorithm with an optimal number of calls to a stochastic first-order oracle and convergence rate O(1/ε^2), improving over the projection-free, Online Frank-Wolfe-based stochastic gradient descent of Hazan and Kale [2012], which has convergence rate O(1/ε^4).
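To give a rough sense of the projection-free setting the abstract refers to, the sketch below shows a plain stochastic Frank-Wolfe (conditional gradient) loop over the probability simplex. It is only an illustrative baseline under assumed choices (the simplex feasible set, the 2/(t+2) step size, and the toy objective), not the conditional accelerated lazy algorithm of the paper.

```python
import numpy as np

def lmo_simplex(grad):
    """Linear minimization oracle over the probability simplex:
    argmin over vertices of <grad, v> is the vertex at the smallest coordinate."""
    v = np.zeros_like(grad)
    v[np.argmin(grad)] = 1.0
    return v

def stochastic_frank_wolfe(stoch_grad, dim, iters=1000, seed=0):
    """Projection-free stochastic optimization: each iteration calls the LMO
    instead of projecting onto the feasible set."""
    rng = np.random.default_rng(seed)
    x = np.full(dim, 1.0 / dim)            # feasible start: uniform point in the simplex
    for t in range(1, iters + 1):
        g = stoch_grad(x, rng)             # unbiased stochastic gradient estimate
        v = lmo_simplex(g)                 # LMO call replaces projection
        gamma = 2.0 / (t + 2)              # standard Frank-Wolfe step size
        x = (1 - gamma) * x + gamma * v    # convex combination stays feasible
    return x

# Toy usage (hypothetical objective): minimize E[0.5 * ||x - (b + noise)||^2] over the simplex.
if __name__ == "__main__":
    b = np.array([0.1, 0.7, 0.2])
    noisy_grad = lambda x, rng: x - (b + 0.01 * rng.standard_normal(b.shape))
    print(stochastic_frank_wolfe(noisy_grad, dim=3))
```

The appeal of this family of methods, which the paper improves on, is that the linear minimization oracle is often far cheaper than a projection onto the feasible set.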
