DeFTA: A Plug-and-Play Decentralized Replacement for FedAvg

04/06/2022
by   Yuhao Zhou, et al.
Federated learning (FL) is widely identified as a crucial enabler of large-scale distributed machine learning (ML) without the need to share local raw datasets, substantially reducing privacy concerns and alleviating the data isolation problem. In practice, the prosperity of FL is largely due to a centralized framework called FedAvg, in which workers are in charge of model training and servers are in charge of model aggregation. However, FedAvg's centralized worker-server architecture has raised new concerns, such as the low scalability of the cluster, the risk of data leakage, and the failure or even defection of the central server. To overcome these problems, we propose Decentralized Federated Trusted Averaging (DeFTA), a decentralized FL framework that serves as a plug-and-play replacement for FedAvg, instantly bringing better security, scalability, and fault tolerance to the federated learning process after installation. In principle, it fundamentally resolves the above-mentioned issues from an architectural perspective without compromises or tradeoffs, and primarily consists of a new model aggregation formula with theoretical performance analysis and a decentralized trust system (DTS) that greatly improves system robustness. Note that since DeFTA is an alternative to FedAvg at the framework level, prevalent algorithms published for FedAvg can also be utilized in DeFTA with ease. Extensive experiments on six datasets and six basic models suggest that DeFTA not only has performance comparable to FedAvg in a more realistic setting, but also achieves great resilience even when 66% of workers are malicious. Furthermore, we also present an asynchronous variant of DeFTA to further improve its usability.
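The abstract refers to a new model aggregation formula but does not state its exact form. The snippet below is only a minimal sketch of the kind of FedAvg-style weighted averaging a decentralized worker could apply to the models received from its peers; the names (PeerUpdate, aggregate) and the dataset-size weighting are illustrative assumptions, not the actual DeFTA formula or trust-weighting scheme.

# Minimal, illustrative sketch of decentralized weighted model averaging.
# NOTE: this is NOT the DeFTA aggregation formula from the paper; it only
# shows FedAvg-style weighting performed locally by one worker over the
# updates it receives from its peers. All names here are hypothetical.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class PeerUpdate:
    weights: Dict[str, List[float]]  # flattened model parameters, per layer
    num_samples: int                 # size of the peer's local dataset


def aggregate(own: PeerUpdate, peers: List[PeerUpdate]) -> Dict[str, List[float]]:
    """Average the worker's own model with its peers' models, weighting each
    contribution by local dataset size (FedAvg-style)."""
    updates = [own] + peers
    total = sum(u.num_samples for u in updates)
    merged: Dict[str, List[float]] = {}
    for layer, params in own.weights.items():
        merged[layer] = [
            sum(u.weights[layer][i] * u.num_samples / total for u in updates)
            for i in range(len(params))
        ]
    return merged

In a decentralized setting, each worker would run such an aggregation step itself after exchanging models with its neighbors, removing the need for a central aggregation server.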


Related research

11/07/2022: Over-The-Air Clustered Wireless Federated Learning
Privacy, security, and bandwidth constraints have led to federated learn...

08/01/2022: DeFL: Decentralized Weight Aggregation for Cross-silo Federated Learning
Federated learning (FL) is an emerging promising paradigm of privacy-pre...

09/20/2020: When Federated Learning Meets Blockchain: A New Distributed Learning Paradigm
Motivated by the advancing computational capabilities of wireless end us...

05/30/2022: Confederated Learning: Federated Learning with Decentralized Edge Servers
Federated learning (FL) is an emerging machine learning paradigm that al...

09/15/2022: How Much Does It Cost to Train a Machine Learning Model over Distributed Data Sources?
Federated learning (FL) is one of the most appealing alternatives to the...

08/18/2020: Deconstructing the Decentralization Trilemma
The vast majority of applications at this moment rely on centralized ser...

04/17/2023: Decentralized Learning Made Easy with DecentralizePy
Decentralized learning (DL) has gained prominence for its potential bene...
