Communication-Efficient Federated Learning via Optimal Client Sampling

07/30/2020
by Mónica Ribero, et al.

Federated learning is a private and efficient framework for learning models in settings where data is distributed across many clients. Due to the interactive nature of the training process, frequent communication of large amounts of information is required between the clients and the central server that aggregates local models. We propose a novel, simple, and efficient way of updating the central model in communication-constrained settings by determining the optimal client sampling policy. In particular, modeling the progression of clients' weights by an Ornstein-Uhlenbeck process allows us to derive the optimal sampling strategy for selecting a subset of clients with significant weight updates. The central server then collects local models from only the selected clients and subsequently aggregates them. We propose four client sampling strategies and test them on two federated learning benchmarks: a classification task on EMNIST and a realistic language modeling task on the Stack Overflow dataset. The results show that the proposed framework provides a significant reduction in communication while maintaining competitive or superior performance compared to baselines. Our methods introduce a new line of communication strategies orthogonal to existing user-local methods such as quantization or sparsification, complementing rather than replacing them.
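To make the idea concrete, here is a minimal sketch of norm-based client selection of the kind the abstract describes: each client computes a local update, the server keeps only the clients whose updates have the largest magnitude, and aggregates those. This is an illustrative assumption, not the paper's actual implementation; the helper names (`select_clients_by_update_norm`, `aggregate`), the top-m rule, and all parameters are hypothetical.

```python
import numpy as np

def select_clients_by_update_norm(updates, m):
    """Keep the m clients whose local updates have the largest L2 norm.

    `updates` maps client id -> flat weight-delta vector. Hypothetical
    stand-in for the sampling policy the paper derives from an
    Ornstein-Uhlenbeck model of weight progression.
    """
    norms = {cid: np.linalg.norm(delta) for cid, delta in updates.items()}
    return sorted(norms, key=norms.get, reverse=True)[:m]

def aggregate(global_weights, updates, selected, client_sizes):
    """FedAvg-style weighted average over the selected clients only."""
    total = sum(client_sizes[cid] for cid in selected)
    delta = sum(client_sizes[cid] / total * updates[cid] for cid in selected)
    return global_weights + delta

# Toy round: 10 clients, a 5-dimensional model, keep the top 3 updates.
rng = np.random.default_rng(0)
w = np.zeros(5)
updates = {cid: rng.normal(scale=0.1, size=5) for cid in range(10)}
sizes = {cid: int(rng.integers(50, 500)) for cid in range(10)}
chosen = select_clients_by_update_norm(updates, m=3)
w = aggregate(w, updates, chosen, sizes)
print("selected clients:", chosen)
```

In a scheme of this shape, the communication saving comes from unselected clients sending only a scalar (their update norm) rather than the full model, which is why such selection is orthogonal to per-client compression methods like quantization or sparsification.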
