Near-Optimal Straggler Mitigation for Distributed Gradient Methods

10/27/2017
by Songze Li, et al.

Modern learning algorithms use gradient descent updates to train inferential models that best explain data. Scaling these approaches to massive data sets requires a proper distributed gradient descent scheme in which worker nodes compute partial gradients on their local portions of the data and send the results to a master node, where the partial gradients are aggregated into a full gradient and the model is updated. A major performance bottleneck in this setting is that some of the worker nodes may run slowly. These nodes, known as stragglers, can significantly delay the computation, since the slowest node may dictate the overall completion time. We propose a distributed computing scheme, called Batched Coupon's Collector (BCC), to alleviate the effect of stragglers in gradient methods. We prove that the BCC scheme is robust to a near-optimal number of random stragglers. We also empirically demonstrate that BCC reduces the run-time by up to 85.4% compared to baseline straggler-mitigation strategies. Finally, we generalize the BCC scheme to minimize the completion time when running gradient descent-based algorithms over heterogeneous worker nodes.
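The master-worker setup described above can be sketched in a few lines of Python. The sketch below is illustrative only: the quadratic least-squares loss, the exponential worker-delay model, the random batch-to-worker assignment, and all function names are assumptions made for demonstration, not the paper's BCC implementation. It shows the general idea of the master aggregating partial gradients and ending a round as soon as the batches received from the fastest workers jointly cover every data partition.

import numpy as np

rng = np.random.default_rng(0)

def partial_gradient(X, y, w):
    """Least-squares partial gradient on one data batch: X^T (X w - y)."""
    return X.T @ (X @ w - y)

def simulate_round(batches, w, num_workers):
    """One aggregation round: each worker holds one randomly assigned batch and
    finishes at a random time; the master stops as soon as the batches received
    from the fastest workers cover every data partition."""
    assignment = rng.integers(len(batches), size=num_workers)  # batch index per worker
    finish = rng.exponential(1.0, size=num_workers)            # toy straggling model
    grads, stop_time = {}, finish.max()
    for worker in np.argsort(finish):                          # workers in arrival order
        b = assignment[worker]
        if b not in grads:                                     # keep first copy of each batch
            X_b, y_b = batches[b]
            grads[b] = partial_gradient(X_b, y_b, w)
        if len(grads) == len(batches):                         # full gradient recoverable
            stop_time = finish[worker]                         # remaining stragglers ignored
            break
    # If the random assignment happens to miss a batch (rare with many workers),
    # the sum below is only a partial gradient -- an acceptable shortcut for this sketch.
    return sum(grads.values()), stop_time

# Toy regression data split into 4 batches, spread over 12 workers.
X = rng.normal(size=(400, 5))
y = X @ np.ones(5) + 0.1 * rng.normal(size=400)
batches = [(X[i::4], y[i::4]) for i in range(4)]

w, elapsed = np.zeros(5), 0.0
for step in range(50):
    g, t = simulate_round(batches, w, num_workers=12)
    w -= 1e-3 * g                                              # gradient descent update
    elapsed += t
print("learned weights:", np.round(w, 2), "| simulated time:", round(elapsed, 2))

In this toy model a round ends once the arrived batches cover all partitions, so the slowest workers never need to be waited for; the paper's contribution concerns how many such random stragglers a scheme can tolerate and at what cost, which this sketch does not analyze.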


Related research:

05/24/2018 - Polynomially Coded Regression: Optimal Straggler Mitigation via Data Encoding
We consider the problem of training a least-squares regression model on ...

03/31/2018 - Fundamental Resource Trade-offs for Encoded Distributed Optimization
Dealing with the sheer size and complexity of today's massive data sets ...

01/27/2019 - Heterogeneity-aware Gradient Coding for Straggler Tolerance
Gradient descent algorithms are widely used in machine learning. In orde...

01/15/2019 - Distributed Stochastic Gradient Descent Using LDGM Codes
We consider a distributed learning problem in which the computation is c...

03/02/2021 - Optimal Communication-Computation Trade-Off in Heterogeneous Gradient Coding
Gradient coding allows a master node to derive the aggregate of the part...

05/13/2021 - Approximate Gradient Coding for Heterogeneous Nodes
In distributed machine learning (DML), the training data is distributed ...

07/12/2017 - Gradient Coding from Cyclic MDS Codes and Expander Graphs
Gradient Descent, and its variants, are a popular method for solving emp...
