On the Tradeoff Between Computation and Communication Costs for Distributed Linearly Separable Computation

10/04/2020
by Kai Wan, et al.

This paper studies the distributed linearly separable computation problem, which generalizes many existing distributed computing problems such as distributed gradient descent and distributed linear transform. In this problem, a master asks N distributed workers to compute a linearly separable function of K datasets, namely a set of K_c linear combinations of K messages (each message is a function of one dataset). We assign some datasets to each worker, which then computes the corresponding messages and returns some function of these messages, such that the master can recover the task function from the answers of any N_r out of the N workers. The existing literature has considered only the specific cases where K_c = 1 or where the computation cost is minimum. In this paper, we focus on the general case (i.e., general K_c and general computation cost) and aim to find the minimum communication cost. We first propose a novel converse bound on the communication cost under the constraint of the cyclic assignment, which is widely considered in the literature and assigns the datasets to the workers in a circular fashion. Motivated by the observation that existing strategies for distributed computing fall short of achieving this converse bound, we propose a novel distributed computing scheme for some system parameters. The proposed computing scheme is optimal over all assignments when K_c is large, and is optimal under the cyclic assignment when the numbers of workers and datasets are equal or when K_c is small. In addition, it is order optimal within a factor of 2 under the cyclic assignment for the remaining cases.
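To make the system model concrete, the following is a minimal Python sketch of the cyclic assignment for the special case K = N. All parameter values (N = 6, K = 6, N_r = 4) are hypothetical and chosen only for illustration; the snippet merely shows which datasets each worker computes on and checks that any N_r workers jointly cover every dataset. It does not implement the paper's coding or decoding scheme.

```python
from itertools import combinations

# Hypothetical parameters for illustration, with K = N.
N = 6            # number of workers
K = 6            # number of datasets
N_r = 4          # the master waits for the answers of any N_r workers
M = N - N_r + 1  # datasets assigned to each worker under the cyclic assignment

# Cyclic assignment: worker n is assigned datasets {n, n+1, ..., n+M-1} mod K.
assignment = {n: [(n + j) % K for j in range(M)] for n in range(N)}
for n, datasets in assignment.items():
    print(f"worker {n}: datasets {datasets}")

# Sanity check: each dataset is held by M = N - N_r + 1 workers, so any
# N_r responding workers jointly cover all K datasets and decoding is possible.
for responding in combinations(range(N), N_r):
    covered = set().union(*(set(assignment[n]) for n in responding))
    assert covered == set(range(K))
print(f"any {N_r} workers jointly cover all {K} datasets")
```

This only illustrates why the cyclic assignment tolerates N - N_r stragglers; the paper's contribution concerns how workers encode their computed messages so that the communication cost is minimized.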
