Near Optimal Coded Data Shuffling for Distributed Learning

01/05/2018
by Mohamed A. Attia, et al.

Data shuffling across a distributed cluster of nodes is one of the critical steps in implementing large-scale learning algorithms. Randomly shuffling the dataset among a cluster of workers allows different nodes to obtain fresh data assignments at each learning epoch, a process that has been shown to improve the learning procedure. However, the statistical benefits of distributed data shuffling come at the cost of extra communication overhead from the master node to the worker nodes, which can become one of the major bottlenecks in the overall computation time. There has been significant recent interest in approaches that minimize this communication overhead. One approach is to provision extra storage at the computing nodes; another emerging approach is to leverage coded communication. The focus of this work is to understand the fundamental trade-off between the amount of storage and the communication overhead for distributed data shuffling. We first present an information-theoretic formulation of the data shuffling problem, accounting for the underlying problem parameters: the number of workers K, the number of data points N, and the available storage S per node. We then derive an information-theoretic lower bound on the communication overhead for data shuffling as a function of these parameters. We next present a novel coded communication scheme and show that its communication overhead is within a multiplicative factor of at most K/(K-1) of the information-theoretic lower bound. Furthermore, we present an aligned coded shuffling scheme for some storage values, which achieves the optimal storage vs. communication trade-off for K < 5 and further reduces the maximum multiplicative gap to (K - 1/3)/(K - 1) for K ≥ 5.
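The coding gain the abstract alludes to can be illustrated with a minimal two-worker sketch in Python (a hypothetical toy setup, not the paper's general scheme for arbitrary K and S): each worker already caches, from the previous epoch, the data block the other worker needs next, so the master can broadcast a single XOR-coded message instead of two uncoded blocks.

```python
import numpy as np

# Toy setup: K = 2 workers, block size of 8 bytes (illustrative values).
rng = np.random.default_rng(0)
block_A = rng.integers(0, 256, size=8, dtype=np.uint8)  # needed by worker 2
block_B = rng.integers(0, 256, size=8, dtype=np.uint8)  # needed by worker 1

# From the previous epoch's assignment, each worker caches the block
# that the *other* worker must receive in the new shuffle.
cache_worker1 = block_A
cache_worker2 = block_B

# Master broadcasts one coded block instead of two uncoded unicasts.
coded = np.bitwise_xor(block_A, block_B)

# Each worker XORs out its cached block to decode its new assignment.
decoded_at_worker1 = np.bitwise_xor(coded, cache_worker1)  # recovers B
decoded_at_worker2 = np.bitwise_xor(coded, cache_worker2)  # recovers A

assert np.array_equal(decoded_at_worker1, block_B)
assert np.array_equal(decoded_at_worker2, block_A)
```

A single coded broadcast serves both workers simultaneously; the paper's scheme generalizes this cache-aided coding gain to K workers and arbitrary per-node storage S, trading extra storage for reduced communication.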


