Decentralized Federated Averaging

04/23/2021
by Tao Sun, et al.

Federated averaging (FedAvg) is a communication-efficient algorithm for distributed training with an enormous number of clients. In FedAvg, clients keep their data locally for privacy protection, and a central parameter server is used to communicate between clients: it distributes the parameters to each client and collects the updated parameters back from them. FedAvg is mostly studied in a centralized fashion, which requires massive communication between the server and clients in each communication round. Moreover, attacking the central server can break the privacy of the whole system. In this paper, we study decentralized FedAvg with momentum (DFedAvgM), which is implemented on clients connected by an undirected graph. In DFedAvgM, all clients perform stochastic gradient descent with momentum and communicate only with their neighbors. To further reduce the communication cost, we also consider the quantized DFedAvgM. We prove convergence of the (quantized) DFedAvgM under mild assumptions; the convergence rate can be improved when the loss function satisfies the PŁ (Polyak-Łojasiewicz) property. Finally, we numerically verify the efficacy of DFedAvgM.
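To make the abstract's description concrete, here is a minimal sketch of one DFedAvgM communication round: each client runs a few steps of SGD with momentum on its own data, then averages parameters with its graph neighbors through a mixing matrix. This is an illustration under stated assumptions (a doubly stochastic mixing matrix W encoding the undirected graph), not the authors' implementation; all function and variable names here are hypothetical.

```python
import numpy as np

def local_sgd_momentum(x, data, grad_fn, lr=0.1, beta=0.9, local_steps=5):
    """Run a few steps of SGD with heavy-ball momentum on one client's local data.
    grad_fn(x, data) is assumed to return a stochastic gradient for parameters x."""
    m = np.zeros_like(x)
    for _ in range(local_steps):
        g = grad_fn(x, data)   # stochastic gradient on a local mini-batch
        m = beta * m + g       # momentum buffer
        x = x - lr * m
    return x

def dfedavgm_round(params, datasets, grad_fn, W):
    """One decentralized round: local training, then neighbor-only averaging via
    the doubly stochastic mixing matrix W (W[i, j] > 0 only for neighbors i, j)."""
    n = len(params)
    # 1) Each client trains locally; no central server is involved.
    local = [local_sgd_momentum(params[i], datasets[i], grad_fn) for i in range(n)]
    # 2) Each client averages only with its neighbors, weighted by W.
    return [sum(W[i, j] * local[j] for j in range(n) if W[i, j] > 0)
            for i in range(n)]
```

In the quantized variant described in the abstract, each client would quantize the parameters it sends to its neighbors before the averaging step, further reducing communication cost.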

