A General Theory for Federated Optimization with Asynchronous and Heterogeneous Clients Updates

06/21/2022
by   Yann Fraboni, et al.

We propose a novel framework to study asynchronous federated learning optimization with delays in gradient updates. Our theoretical framework extends the standard FedAvg aggregation scheme by introducing stochastic aggregation weights to represent the variability of the clients' update times, due for example to heterogeneous hardware capabilities. Our formalism applies to the general federated setting where clients have heterogeneous datasets and perform at least one step of stochastic gradient descent (SGD). We demonstrate convergence for such a scheme and provide sufficient conditions for the related minimum to be the optimum of the federated problem. We show that our general framework applies to existing optimization schemes, including centralized learning, FedAvg, asynchronous FedAvg, and FedBuff. The theory provided here allows drawing meaningful guidelines for designing federated learning experiments in heterogeneous conditions. In particular, we develop FedFix, a novel extension of FedAvg enabling efficient asynchronous federated training while preserving the convergence stability of synchronous aggregation. We empirically validate our theory through a series of experiments showing that asynchronous FedAvg leads to fast convergence at the expense of stability, and we finally demonstrate the improvements of FedFix over synchronous and asynchronous FedAvg.
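To make the aggregation mechanism concrete, the sketch below simulates a toy asynchronous federated loop in which each client's update arrives after a random delay and the server aggregates whatever has arrived at fixed deadlines, in the spirit of the FedFix scheme described above. This is a minimal illustration, not the paper's implementation: the names (Client, run_fedfix, delta_t), the quadratic local objectives, the exponential delays, and the simple averaging of received updates are all assumptions chosen for the example.

```python
# Illustrative sketch (not the paper's reference implementation): a toy
# asynchronous federated loop where each client's update arrives after a
# random delay and the server aggregates at fixed deadlines, loosely in
# the spirit of FedFix. All names and modeling choices are hypothetical.

import random

class Client:
    """Client with a 1-D quadratic objective f_i(w) = 0.5 * (w - target)^2."""
    def __init__(self, target, delay_mean):
        self.target = target          # heterogeneous local optimum
        self.delay_mean = delay_mean  # heterogeneous hardware speed
        self.pending = None           # (ready_time, update) in flight

    def local_sgd(self, w, steps=5, lr=0.1):
        """Run a few local SGD steps and return the resulting model delta."""
        w_local = w
        for _ in range(steps):
            w_local -= lr * (w_local - self.target)   # gradient of f_i
        return w_local - w

def run_fedfix(clients, delta_t=1.0, rounds=50, server_lr=0.5):
    """Aggregate whichever updates have arrived every `delta_t` time units."""
    w, now = 0.0, 0.0
    # start every client on the initial model
    for c in clients:
        delay = random.expovariate(1.0 / c.delay_mean)
        c.pending = (now + delay, c.local_sgd(w))
    for _ in range(rounds):
        now += delta_t
        ready = [c for c in clients if c.pending[0] <= now]
        if ready:
            # the set of contributors is random, so the effective
            # aggregation weights are stochastic; slow clients contribute
            # updates computed on an older (stale) server model
            w += server_lr * sum(c.pending[1] for c in ready) / len(ready)
        # re-dispatch the clients that just contributed, from the new model
        for c in ready:
            delay = random.expovariate(1.0 / c.delay_mean)
            c.pending = (now + delay, c.local_sgd(w))
    return w

if __name__ == "__main__":
    random.seed(0)
    clients = [Client(target=t, delay_mean=d)
               for t, d in [(1.0, 0.5), (2.0, 1.5), (3.0, 4.0)]]
    print("final model:", run_fedfix(clients))
```

In this toy setting, varying delta_t interpolates between a fully synchronous regime (the server waits for every client) and a highly asynchronous one where only the fastest clients contribute to most rounds, which mirrors the speed-versus-stability trade-off discussed in the abstract.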


Related research

CSMAAFL: Client Scheduling and Model Aggregation in Asynchronous Federated Learning (06/01/2023)
Asynchronous federated learning aims to solve the straggler problem in h...

Unbounded Gradients in Federated Learning with Buffered Asynchronous Aggregation (10/03/2022)
Synchronous updates may compromise the efficiency of cross-device federa...

Distributed Learning in Heterogeneous Environment: federated learning with adaptive aggregation and computation reduction (02/16/2023)
Although federated learning has achieved many breakthroughs recently, th...

Privacy-Preserving Model Aggregation for Asynchronous Federated Learning (05/27/2023)
We present a novel privacy-preserving model aggregation for asynchronous...

Blockchain-enabled Server-less Federated Learning (12/15/2021)
Motivated by the heterogeneous nature of devices participating in large-...

Secure Aggregation for Buffered Asynchronous Federated Learning (10/05/2021)
Federated learning (FL) typically relies on synchronous training, which ...

Federated Action Recognition on Heterogeneous Embedded Devices (07/18/2021)
Federated learning allows a large number of devices to jointly learn a m...
