Straggler Mitigation through Unequal Error Protection for Distributed Approximate Matrix Multiplication

03/04/2021
by   Busra Tegin, et al.

Large-scale machine learning and data mining methods routinely distribute computations across multiple agents to parallelize processing. The time required for the computations at the agents is affected by the availability of local resources, giving rise to the "straggler problem". As a remedy to this problem, linear coding of the matrix sub-blocks can be used, i.e., the Parameter Server (PS) utilizes a channel code to encode the matrix sub-blocks and distributes the coded blocks to the workers for multiplication. In this paper, we employ Unequal Error Protection (UEP) codes to obtain an approximation of the matrix product in the distributed computation setting in the presence of stragglers. The resiliency level of each sub-block is chosen according to its norm, as blocks with larger norms have a greater effect on the result of the matrix multiplication. In particular, we consider two approaches to distributing the matrix computation: (i) a row-times-column paradigm, and (ii) a column-times-row paradigm. For both paradigms, we characterize the performance of the proposed approach from a theoretical perspective by bounding the expected reconstruction error for matrices with uncorrelated entries. We also apply the proposed coding strategy to the computation of the back-propagation step, i.e., the evaluation of the gradient, in the training of a Deep Neural Network (DNN) for an image classification task. Our numerical experiments show that it is indeed possible to obtain significant improvements in the overall time required to achieve DNN training convergence by producing matrix product approximations using UEP codes.
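The intuition behind the column-times-row paradigm and the norm-based protection levels can be illustrated with a minimal sketch (all block sizes and names below are hypothetical, and plain uncoded block dropping stands in for the paper's UEP coding): the product AB decomposes into a sum of outer products of column blocks of A with row blocks of B, and keeping only the highest-norm terms mimics the approximation obtained when low-priority blocks are lost to stragglers.

```python
import numpy as np

# Hypothetical column-times-row sketch: A is split into column blocks and B
# into row blocks, so that A @ B equals the sum of the outer products
# A_cols[i] @ B_rows[i]. Blocks with larger norms contribute more to the
# product, so they would receive stronger protection under a UEP code; here
# we simply sum the largest-norm terms first to imitate losing the rest.
rng = np.random.default_rng(0)
m, n, p, blocks = 8, 12, 10, 4

A = rng.standard_normal((m, n))
B = rng.standard_normal((n, p))

A_cols = np.split(A, blocks, axis=1)   # column blocks of A
B_rows = np.split(B, blocks, axis=0)   # matching row blocks of B

# Rank block pairs by the product of their Frobenius norms, a simple proxy
# for each term's impact on the final product.
order = sorted(
    range(blocks),
    key=lambda i: np.linalg.norm(A_cols[i]) * np.linalg.norm(B_rows[i]),
    reverse=True,
)

# Approximate A @ B from only the k highest-priority outer products, as if
# the remaining low-priority blocks never arrived from straggling workers.
k = 3
approx = sum(A_cols[i] @ B_rows[i] for i in order[:k])
exact = A @ B
rel_err = np.linalg.norm(exact - approx) / np.linalg.norm(exact)
```

Summing all four outer products recovers the exact product; dropping the lowest-norm term leaves a relative error governed by that term's norm, which is the quantity the paper's reconstruction-error bounds control.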
