SGD with shuffling: optimal rates without component convexity and large epoch requirements

We study without-replacement SGD for solving finite-sum optimization problems. Specifically, depending on how the indices of the finite sum are shuffled, we consider the SingleShuffle (shuffle only once) and RandomShuffle (shuffle at the beginning of each epoch) algorithms. First, we establish minimax optimal convergence rates of these algorithms up to poly-log factors. Notably, our analysis is general enough to cover gradient-dominated nonconvex costs and, unlike existing optimal convergence results, does not rely on the convexity of the individual component functions. Second, assuming convexity of the individual components, we further sharpen the tight convergence results by removing two drawbacks common to all prior art: the large number of epochs required for the results to hold, and the extra poly-log factor gaps to the lower bound.
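To make the distinction between the two schemes concrete, here is a minimal sketch of without-replacement SGD on a toy finite sum. The function name `shuffled_sgd`, the per-component gradient list, and the toy quadratic objective are illustrative assumptions, not the paper's code; the only point is the one-line difference between SingleShuffle (permute once, reuse the order) and RandomShuffle (re-permute every epoch).

```python
import numpy as np

def shuffled_sgd(grads, x0, lr, epochs, scheme="random", seed=0):
    """Without-replacement SGD for f(x) = (1/n) * sum_i f_i(x).

    scheme="single": SingleShuffle -- permute the indices once and reuse
                     that fixed order in every epoch.
    scheme="random": RandomShuffle -- draw a fresh permutation at the
                     start of each epoch.
    `grads` is a list of per-component gradient functions (hypothetical API).
    """
    rng = np.random.default_rng(seed)
    n = len(grads)
    perm = rng.permutation(n)  # SingleShuffle: fixed for all epochs
    x = np.asarray(x0, dtype=float)
    for _ in range(epochs):
        if scheme == "random":
            perm = rng.permutation(n)  # RandomShuffle: new order each epoch
        for i in perm:  # one pass = one epoch, each index used exactly once
            x = x - lr * grads[i](x)
    return x

# Toy finite sum: f_i(x) = 0.5 * (x - a_i)^2, so the minimizer is mean(a).
a = np.array([1.0, 2.0, 3.0, 4.0])
grads = [lambda x, ai=ai: x - ai for ai in a]
x_ss = shuffled_sgd(grads, x0=0.0, lr=0.01, epochs=200, scheme="single")
x_rs = shuffled_sgd(grads, x0=0.0, lr=0.01, epochs=200, scheme="random")
# both iterates end up near the minimizer mean(a) = 2.5
```

Note that after an epoch the iterate settles near an order-dependent fixed point whose distance to the true minimizer shrinks with the step size; this within-epoch bias is exactly what the without-replacement analyses above have to control.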