
Active Federated Learning

by Jack Goetz, et al.

Federated Learning allows population-level models to be trained without centralizing client data: the global model is transmitted to clients, gradients are computed locally, and the gradients are then averaged. Downloading models and uploading gradients consumes client bandwidth, so minimizing these transmission costs is important. The data on each client is highly variable, so the benefit of training on different clients may differ dramatically. To exploit this, we propose Active Federated Learning, in which clients in each round are selected not uniformly at random but with a probability conditioned on the current model and the data on the client, so as to maximize efficiency. We propose a cheap, simple, and intuitive sampling scheme that reduces the number of required training iterations by 20-70% while maintaining the same model accuracy, and that mimics well-known resampling techniques under certain conditions.
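The core idea of the sampling step can be sketched in a few lines. This is a hypothetical illustration, not the paper's exact scheme: it assumes the per-client valuation is the client's current local loss (a cheap signal of how informative training on that client would be), and the function names and softmax temperature are invented for the example.

```python
import math
import random

def sampling_probabilities(client_losses, temperature=1.0):
    """Turn per-client valuations into a softmax sampling distribution.

    Clients where the current model performs worst (higher loss) receive
    higher probability of being selected for the next training round.
    """
    scaled = [loss / temperature for loss in client_losses]
    m = max(scaled)  # subtract the max before exponentiating, for stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def select_clients(client_losses, num_selected, rng=None):
    """Sample distinct clients for one round, biased toward high-loss clients."""
    rng = rng or random.Random()
    probs = sampling_probabilities(client_losses)
    remaining = list(range(len(client_losses)))
    chosen = []
    for _ in range(num_selected):
        weights = [probs[i] for i in remaining]
        pick = rng.choices(remaining, weights=weights, k=1)[0]
        chosen.append(pick)
        remaining.remove(pick)  # sample without replacement
    return chosen

# Example round: clients with higher local loss are more likely to be chosen.
losses = [0.2, 1.5, 0.3, 2.0, 0.1]
selected = select_clients(losses, num_selected=2, rng=random.Random(0))
```

The selected clients would then receive the global model, compute gradients locally, and upload them for averaging as in standard federated learning; only the selection step differs from uniform sampling.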



