Efficient and Convergent Federated Learning

05/03/2022
by   Shenglong Zhou, et al.

Federated learning has advanced considerably in recent years but still faces several challenges: how algorithms can save communication resources, how they can reduce computational costs, and whether they converge. To address these issues, this paper proposes a new federated learning algorithm, FedGiA, which combines gradient descent with the inexact alternating direction method of multipliers (ADMM). FedGiA is shown to be computation- and communication-efficient and to converge linearly under mild conditions.
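The abstract only names the ingredients, gradient descent plus inexact ADMM, so the following is a minimal illustrative sketch of that general combination on a toy consensus least-squares problem, not the paper's FedGiA algorithm itself. All names, data, and parameter values (penalty `sigma`, step size `lr`, the number of local gradient steps) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: each client holds a local least-squares problem (illustrative).
n_clients, dim = 5, 3
A = [rng.standard_normal((20, dim)) for _ in range(n_clients)]
b = [a @ np.ones(dim) + 0.1 * rng.standard_normal(20) for a in A]

def local_grad(i, w):
    """Gradient of the local loss f_i(w) = 0.5 * ||A_i w - b_i||^2."""
    return A[i].T @ (A[i] @ w - b[i])

sigma = 5.0      # ADMM penalty parameter (assumed value)
lr = 0.01        # local gradient step size (assumed value)
local_steps = 3  # "inexact" local solve: only a few gradient steps

z = np.zeros(dim)                                # global model
w = [z.copy() for _ in range(n_clients)]         # local models
lam = [np.zeros(dim) for _ in range(n_clients)]  # dual variables

for _ in range(200):
    # Local step: each client inexactly minimizes its augmented Lagrangian
    #   L_i(w) = f_i(w) + lam_i^T (w - z) + (sigma/2) ||w - z||^2
    # with a few gradient descent steps instead of an exact solve.
    for i in range(n_clients):
        for _ in range(local_steps):
            g = local_grad(i, w[i]) + lam[i] + sigma * (w[i] - z)
            w[i] = w[i] - lr * g
    # Server step: aggregate local models and duals into the global model.
    z = np.mean([w[i] + lam[i] / sigma for i in range(n_clients)], axis=0)
    # Dual update on each client.
    for i in range(n_clients):
        lam[i] = lam[i] + sigma * (w[i] - z)

print(np.round(z, 2))  # approaches the all-ones ground truth
```

The communication saving in such schemes comes from each client running several cheap local gradient steps between synchronizations, so the server and clients exchange models only once per outer round rather than once per gradient step.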
