Secure and Efficient Decentralized Federated Learning with Data Representation Protection
Federated learning (FL) is a promising enabler of the vision of ubiquitous artificial intelligence in sixth-generation (6G) wireless networks. However, traditional FL relies heavily on a trusted central server. FL is also vulnerable to poisoning attacks, and the global aggregation of model updates puts private training data at risk of being reconstructed. Moreover, FL suffers from efficiency problems due to heavy communication costs. Although decentralized FL removes the dependence on a central server, it exacerbates the other problems. In this paper, we propose BlockDFL, an efficient fully peer-to-peer (P2P) framework for decentralized FL. It integrates gradient compression and our designed voting mechanism with blockchain to efficiently coordinate mutually untrusted peer participants in decentralized FL, while preventing training data from being reconstructed from transmitted model updates. Extensive experiments on two real-world datasets show that BlockDFL achieves accuracy competitive with centralized FL and defends against poisoning attacks while remaining efficient and scalable. Even when the proportion of malicious participants reaches 40 percent, BlockDFL still preserves the accuracy of FL, outperforming existing fully decentralized FL frameworks.
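The abstract does not specify which compression scheme BlockDFL uses; a common choice in communication-efficient FL is top-k gradient sparsification, where each participant transmits only the largest-magnitude gradient entries. A minimal pure-Python sketch under that assumption (function names are illustrative, not from BlockDFL):

```python
def topk_compress(grad, ratio=0.01):
    """Keep only the top `ratio` fraction of gradient entries by magnitude.

    Returns a sparse list of (index, value) pairs, which is far cheaper
    to transmit than the dense gradient and also limits how much of the
    update an eavesdropper can use for data reconstruction.
    """
    k = max(1, int(len(grad) * ratio))
    order = sorted(range(len(grad)), key=lambda i: abs(grad[i]), reverse=True)
    return [(i, grad[i]) for i in order[:k]]

def topk_decompress(pairs, size):
    """Rebuild a dense gradient, zero outside the transmitted entries."""
    dense = [0.0] * size
    for i, v in pairs:
        dense[i] = v
    return dense

# Example: keep the 2 largest-magnitude entries of a 6-element gradient.
g = [0.1, -5.0, 0.3, 2.0, 0.0, -0.2]
sparse = topk_compress(g, ratio=0.34)          # k = 2
g_hat = topk_decompress(sparse, len(g))        # [0.0, -5.0, 0.0, 2.0, 0.0, 0.0]
```

In a P2P setting, each peer would broadcast only `sparse`, so bandwidth scales with `k` rather than the full model size.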