Towards Efficient Scheduling of Federated Mobile Devices under Computational and Statistical Heterogeneity

05/25/2020
by   Cong Wang, et al.

Originating from distributed learning, federated learning enables privacy-preserving collaboration at a new level of abstraction by sharing only the model parameters. While current research focuses mainly on optimizing learning algorithms and minimizing the communication overhead inherited from distributed learning, there is still a considerable gap when it comes to real implementations on mobile devices. In this paper, we start with an empirical experiment to demonstrate that computational heterogeneity is a more pronounced bottleneck than communication on the current generation of battery-powered mobile devices, and that existing methods are haunted by mobile stragglers. Furthermore, non-identically distributed data across mobile users makes the selection of participants critical to accuracy and convergence. To tackle this computational and statistical heterogeneity, we use data as a tunable knob and propose two efficient polynomial-time algorithms to schedule different workloads on various mobile devices, for identically and non-identically distributed data. For identically distributed data, we combine partitioning with linear bottleneck assignment to achieve near-optimal training time without accuracy loss. For non-identically distributed data, we convert the problem into an average cost minimization problem and propose a greedy algorithm to find a reasonable balance between computation time and accuracy. We also build an offline profiler to quantify the runtime behavior of different devices, which serves as input to the scheduling algorithms. We conduct extensive experiments on a mobile testbed with two datasets and up to 20 devices. Compared with the common benchmarks, the proposed algorithms achieve a 2-100x speedup epoch-wise, a 2-7% accuracy gain, and boost the convergence rate by more than 100%.
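
To make the identically-distributed case concrete, here is a minimal sketch of the linear bottleneck assignment step: data shards are matched to devices so that the slowest device finishes as early as possible. This is an illustration under assumptions, not the paper's implementation; the `sec_per_sample` profiler outputs, the shard sizes, and the `bottleneck_assignment` helper are all hypothetical, and the paper additionally couples the assignment with the data-partitioning step.

```python
# Hedged sketch: bottleneck assignment of data shards to heterogeneous
# mobile devices, minimizing the maximum per-device epoch time.
from itertools import product

def bottleneck_assignment(cost):
    """Assign shard j to device i so the largest single cost is minimized.
    cost: n x n matrix, cost[i][j] = time for device i to train shard j.
    Returns (bottleneck_time, {device: shard}).
    """
    n = len(cost)
    values = sorted({cost[i][j] for i, j in product(range(n), repeat=2)})

    def feasible(limit):
        # Look for a perfect matching using only edges with cost <= limit,
        # via simple augmenting paths (Kuhn's algorithm).
        match = [-1] * n  # match[j] = device currently holding shard j

        def augment(i, seen):
            for j in range(n):
                if cost[i][j] <= limit and j not in seen:
                    seen.add(j)
                    if match[j] == -1 or augment(match[j], seen):
                        match[j] = i
                        return True
            return False

        if all(augment(i, set()) for i in range(n)):
            return match
        return None

    # Binary search over the sorted distinct costs for the smallest
    # threshold that still admits a perfect matching.
    lo, hi, best = 0, len(values) - 1, None
    while lo <= hi:
        mid = (lo + hi) // 2
        match = feasible(values[mid])
        if match:
            best, hi = (values[mid], match), mid - 1
        else:
            lo = mid + 1
    limit, match = best
    return limit, {match[j]: j for j in range(n)}

# Hypothetical offline-profiler output (seconds per training sample) and
# shard sizes produced by the partitioning step.
sec_per_sample = [0.8, 1.5, 3.0]   # fast, mid-range, slow device
shard_sizes = [4000, 2000, 1000]   # samples per shard
cost = [[s * c for s in shard_sizes] for c in sec_per_sample]
limit, assignment = bottleneck_assignment(cost)
print(f"epoch time bounded by {limit:.0f}s, device->shard: {assignment}")
```

With identically distributed data, re-balancing shard sizes this way changes only how long each device computes, not what the global model learns, which is why the scheduling speedup comes without accuracy loss; the non-identically distributed case instead trades computation time against accuracy via the greedy average-cost heuristic described above.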



