Federated Learning Under Intermittent Client Availability and Time-Varying Communication Constraints

05/13/2022
by Mónica Ribero, et al.

Federated learning systems facilitate training of global models in settings where potentially heterogeneous data is distributed across a large number of clients. Such systems operate under intermittent client availability and/or time-varying communication constraints. As a result, the global models trained by federated learning systems may be biased towards clients with higher availability. We propose F3AST, an unbiased algorithm that dynamically learns an availability-dependent client selection strategy which asymptotically minimizes the impact of client-sampling variance on global model convergence, enhancing the performance of federated learning. The proposed algorithm is tested in a variety of settings with intermittently available clients under communication constraints, and its efficacy is demonstrated on synthetic data and realistically federated benchmarking experiments using the CIFAR100 and Shakespeare datasets. We show up to 186% and 8% improvements over FedAvg on CIFAR100 and Shakespeare, respectively.
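To illustrate why availability-dependent client selection calls for debiasing, the sketch below shows a generic inverse-probability-weighted aggregation of sampled client updates. This is only a minimal illustration of unbiased aggregation under non-uniform sampling, not the paper's F3AST selection rule; the function name, the per-client selection probabilities, and the toy data are all hypothetical.

```python
import numpy as np

def unbiased_aggregate(client_updates, selection_probs):
    """Inverse-probability-weighted average of the sampled client updates.

    client_updates:  dict mapping client id -> model update (np.ndarray) for
                     the clients that were available and selected this round.
    selection_probs: dict mapping client id -> probability that the client is
                     selected in a round (hypothetical availability estimate).

    Weighting each received update by 1 / p_i keeps the expected aggregate
    equal to the plain average over all clients, even when availability skews
    which clients actually report.
    """
    n_clients = len(selection_probs)
    total = None
    for cid, update in client_updates.items():
        weighted = update / selection_probs[cid]
        total = weighted if total is None else total + weighted
    return total / n_clients

# Toy round: client "b" is rarely available, so its update is up-weighted.
updates = {"a": np.array([1.0, 0.0]), "b": np.array([0.0, 1.0])}
probs = {"a": 0.9, "b": 0.1, "c": 0.5}  # client "c" was not selected this round
print(unbiased_aggregate(updates, probs))
```

In this toy example the rarely available client "b" contributes with a larger weight, which is the basic mechanism by which availability-aware schemes avoid biasing the global model toward highly available clients.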
