FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning

07/20/2022
by Yuanhao Xiong, et al.

Federated learning (FL) has recently attracted increasing attention from academia and industry, with the ultimate goal of achieving collaborative training under privacy and communication constraints. Existing FL algorithms based on iterative model averaging require a large number of communication rounds to obtain a well-performing model, owing to the highly unbalanced and non-i.i.d. data partitioning among clients. To address this, we propose FedDM, which builds the global training objective from multiple local surrogate functions and thereby gives the server a more global view of the loss landscape. Specifically, each client constructs a synthetic set of data that locally matches the loss landscape of its original data through distribution matching. FedDM reduces the number of communication rounds and improves model quality by transmitting synthesized data that are smaller and more informative than unwieldy model weights. We conduct extensive experiments on three image classification datasets, and the results show that our method outperforms other FL counterparts in both efficiency and model performance. Moreover, we demonstrate that FedDM can be adapted to preserve differential privacy with the Gaussian mechanism and to train a better model under the same privacy budget.
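
At its core, each client fits a small synthetic set so that its feature statistics match those of the local data, and only that synthetic set is sent to the server. The sketch below illustrates one way such per-client distribution matching could look in PyTorch; it is a minimal illustration rather than the authors' implementation, and names such as `synthesize_client_data`, `real_loader`, and the random embedding network are assumptions made for this example.

```python
# A minimal sketch (not the authors' released code) of per-client distribution matching,
# assuming a PyTorch setup; `real_loader`, the embedding network, and the hyperparameters
# below are illustrative placeholders.
import torch
import torch.nn as nn

def synthesize_client_data(real_loader, num_synthetic, image_shape, num_classes,
                           steps=1000, lr=0.1, device="cpu"):
    """Learn a small synthetic set whose class-wise feature statistics match the client's data."""
    # Synthetic images are the optimization variables; labels are fixed and balanced.
    syn_x = torch.randn(num_synthetic, *image_shape, device=device, requires_grad=True)
    syn_y = torch.arange(num_synthetic, device=device) % num_classes
    opt = torch.optim.SGD([syn_x], lr=lr, momentum=0.5)

    for _ in range(steps):
        # Draw a fresh randomly initialized embedding network each step so the synthetic
        # set matches the real distribution across many random feature spaces.
        embed = nn.Sequential(
            nn.Conv2d(image_shape[0], 64, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        ).to(device)
        for p in embed.parameters():
            p.requires_grad_(False)

        real_x, real_y = next(iter(real_loader))
        real_x, real_y = real_x.to(device), real_y.to(device)

        loss = torch.zeros((), device=device)
        for c in range(num_classes):
            r, s = real_x[real_y == c], syn_x[syn_y == c]
            if len(r) == 0 or len(s) == 0:
                continue
            # MMD-style objective: match mean embeddings of real and synthetic samples per class.
            loss = loss + ((embed(r).mean(0) - embed(s).mean(0)) ** 2).sum()

        opt.zero_grad()
        if loss.requires_grad:
            loss.backward()
            opt.step()

    # The client uploads (syn_x, syn_y) to the server instead of model weights.
    return syn_x.detach(), syn_y
```

In each communication round, the server would then pool the synthetic sets received from the sampled clients and train the global model on that pooled data.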

Related research

FedProf: Optimizing Federated Learning with Dynamic Data Profiling (02/02/2021)
Federated Learning (FL) has shown great potential as a privacy-preservin...

H-FL: A Hierarchical Communication-Efficient and Privacy-Protected Architecture for Federated Learning (06/01/2021)
The longstanding goals of federated learning (FL) require rigorous priva...

Improving Privacy-Preserving Vertical Federated Learning by Efficient Communication with ADMM (07/20/2022)
Federated learning (FL) enables distributed devices to jointly train a s...

Dynamic Attention-based Communication-Efficient Federated Learning (08/12/2021)
Federated learning (FL) offers a solution to train a global machine lear...

Gradient-less Federated Gradient Boosting Trees with Learnable Learning Rates (04/15/2023)
The privacy-sensitive nature of decentralized datasets and the robustnes...

An Efficient Virtual Data Generation Method for Reducing Communication in Federated Learning (06/21/2023)
Communication overhead is one of the major challenges in Federated Learn...

Federated Learning with Neural Graphical Models (09/20/2023)
Federated Learning (FL) addresses the need to create models based on pro...
