Communication-Efficient Decentralized Federated Learning via One-Bit Compressive Sensing

08/31/2023
by   Shenglong Zhou, et al.

Decentralized federated learning (DFL) has gained popularity due to its practicality across various applications. Compared with the centralized setting, training a shared model among a large number of nodes in DFL is more challenging because there is no central server to coordinate the training process; when the distributed nodes are further limited in communication or computational resources, training becomes highly inefficient and unstable. Motivated by these challenges, in this paper we develop a novel algorithm based on the framework of the inexact alternating direction method (iADM). On the one hand, the shared model is trained under a sparsity constraint, which enables us to leverage one-bit compressive sensing (1BCS) so that only one-bit information needs to be transmitted among neighbouring nodes. On the other hand, communication between neighbouring nodes occurs only at certain steps, reducing the number of communication rounds; the algorithm therefore exhibits notable communication efficiency. Moreover, since each node selects only a subset of its neighbours to participate in training, the algorithm is robust against stragglers. Computationally, expensive quantities are evaluated only once and reused over several consecutive steps, and subproblems are solved inexactly using closed-form solutions, resulting in high computational efficiency. Finally, numerical experiments showcase the algorithm's effectiveness in both communication and computation.
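To make the 1BCS idea concrete, the following is a minimal NumPy sketch of the generic one-bit compressive sensing pipeline the abstract alludes to: a node transmits only the signs of random projections of a sparse vector, and a neighbour recovers a (normalized) estimate via simple back-projection and hard thresholding. This is an illustration of standard 1BCS, not the paper's iADM algorithm; the dimensions, Gaussian sensing matrix, and one-shot estimator are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, s = 50, 200, 3  # model dimension, number of one-bit measurements, sparsity

# A sparse "model" vector a node wants to share (hypothetical example data).
# One-bit measurements lose all scale information, so we normalize it.
x = np.zeros(n)
x[rng.choice(n, size=s, replace=False)] = rng.standard_normal(s)
x /= np.linalg.norm(x)

# Sender side: project through a random Gaussian sensing matrix and
# transmit only the sign bits -- one bit per measurement.
A = rng.standard_normal((m, n))
y = np.sign(A @ x)

# Receiver side: a simple one-shot estimator -- back-project the bits,
# keep the s largest-magnitude entries (hard thresholding), renormalize.
z = A.T @ y
support = np.argsort(np.abs(z))[-s:]
x_hat = np.zeros(n)
x_hat[support] = z[support]
x_hat /= np.linalg.norm(x_hat)
```

With enough measurements relative to the sparsity level, the recovered direction `x_hat` correlates strongly with the true `x`, which is what makes transmitting one bit per measurement viable between neighbouring nodes.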
