Distributed Linearly Separable Computation

07/01/2020
by Kai Wan, et al.

This paper formulates a distributed computation problem in which a master asks N distributed workers to compute a linearly separable function. The task function can be expressed as K_c linear combinations of K messages, where each message is a function of one dataset. Our objective is to find the optimal tradeoff between the computation cost (the number of datasets assigned to each worker) and the communication cost (the number of symbols the master must download), such that the master can recover the task function from the answers of any N_r out of the N workers. The formulated problem can be seen as a generalization of existing problems such as distributed gradient descent and distributed linear transforms. In this paper, we consider the specific case where the computation cost is minimum, and propose novel converse and achievable bounds on the optimal communication cost. The proposed bounds coincide for some system parameters; when they do not match, we prove that the achievable distributed computing scheme is optimal under the constraint of a widely used "cyclic assignment" of the datasets. Our results also show that when K = N, with the same communication cost as the optimal distributed gradient descent coding scheme proposed by Tandon et al., from which the master recovers one linear combination of the K messages, our proposed scheme lets the master recover any additional N_r - 1 independent linear combinations of the messages with high probability.
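To make the "cyclic assignment" concrete, the following Python sketch (a toy construction of ours for illustration, not the paper's exact coding scheme) assigns datasets to workers cyclically so that each dataset is replicated on N - N_r + 1 workers. This replication level is what guarantees that any N_r workers jointly hold every dataset, since at most N - N_r workers are missing. The function name and parameters are our own choices.

```python
from itertools import combinations

def cyclic_assignment(N, N_r, K):
    """Assign K datasets to N workers cyclically.

    Each worker holds (K/N) * (N - N_r + 1) datasets, so every dataset
    is replicated on N - N_r + 1 workers (a toy illustration of the
    cyclic assignment, assuming N divides K).
    """
    assert K % N == 0
    q = K // N                # datasets per cyclic slot
    span = N - N_r + 1        # consecutive slots held by each worker
    assignment = []
    for n in range(N):
        slots = [(n + j) % N for j in range(span)]
        datasets = sorted(d for s in slots
                            for d in range(s * q, (s + 1) * q))
        assignment.append(datasets)
    return assignment

# Sanity check: any N_r of the N workers together cover all K datasets,
# so the master can always recover the task function from their answers.
N, N_r, K = 5, 3, 5
assign = cyclic_assignment(N, N_r, K)
for workers in combinations(range(N), N_r):
    covered = set().union(*(set(assign[w]) for w in workers))
    assert covered == set(range(K))
```

The check at the end exercises every subset of N_r workers, mirroring the requirement that the master tolerate any N - N_r stragglers.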
