On the Convergence Analysis of Asynchronous SGD for Solving Consistent Linear Systems

04/05/2020
by   Atal Narayan Sahu, et al.

In the realm of big data and machine learning, data-parallel, distributed stochastic algorithms have drawn significant attention in recent years. While the synchronous versions of these algorithms are well understood in terms of their convergence, the convergence analyses of their asynchronous counterparts are not widely studied. In this paper, we propose and analyze a distributed, asynchronous parallel SGD method for solving an arbitrary consistent linear system by reformulating the system as a stochastic optimization problem, as studied by Richtárik and Takáč in [35]. We compare the convergence rates of our asynchronous SGD algorithm with the synchronous parallel algorithm proposed by Richtárik and Takáč in [35] under different choices of the hyperparameters: the stepsize, the damping factor, the number of processors, and the delay factor. We show that our asynchronous parallel SGD algorithm also enjoys a global linear convergence rate, similar to the basic method and the synchronous parallel method of [35], for solving any consistent linear system via stochastic reformulation. We also show that our asynchronous parallel SGD improves upon the basic method with a better convergence rate when the number of processors is larger than four. We further show that this asynchronous approach performs asymptotically better than its synchronous counterpart for certain linear systems. Moreover, for certain linear systems, we compute the minimum number of processors required for which our asynchronous parallel SGD is better, and find that this number can be as low as two for some ill-conditioned problems.
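To make the stochastic reformulation concrete, below is a minimal sketch (not the authors' exact algorithm) of serial SGD applied to a consistent linear system Ax = b. Sampling one row uniformly at random reduces the SGD step to the randomized Kaczmarz update, a special case of the sketch-and-project framework of Richtárik and Takáč; the stepsize `omega`, the sampling rule, and the synthetic problem data are illustrative assumptions.

```python
# Sketch: serial SGD on the stochastic reformulation of a consistent
# linear system Ax = b, with single-row (Kaczmarz-type) sampling.
# This is an illustrative baseline, not the paper's asynchronous method.
import numpy as np


def kaczmarz_sgd(A, b, omega=1.0, num_iters=5000, seed=0):
    """Serial SGD on f(x) = E_i [ (a_i^T x - b_i)^2 / (2 ||a_i||^2) ]."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    x = np.zeros(n)
    row_norms_sq = np.einsum("ij,ij->i", A, A)  # ||a_i||^2 for every row
    for _ in range(num_iters):
        i = rng.integers(m)                      # sample one equation
        residual = A[i] @ x - b[i]               # a_i^T x - b_i
        x -= omega * (residual / row_norms_sq[i]) * A[i]  # SGD / Kaczmarz step
    return x


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((200, 50))
    x_star = rng.standard_normal(50)
    b = A @ x_star                               # consistent by construction
    x = kaczmarz_sgd(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))
```

In the asynchronous parallel setting analyzed in the paper, several processors would apply such updates to a shared iterate using possibly stale copies of x, with the staleness controlled by the delay factor; the sketch above omits that delay model.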

