PPT: A Privacy-Preserving Global Model Training Protocol for Federated Learning in P2P Networks

05/30/2021 ∙ by Qian Chen, et al.

The concept of Federated Learning has emerged at the convergence of distributed machine learning, information, and communication technology. It is vital to the development of distributed machine learning, which is expected to be fully decentralized, robust, communication-efficient, and secure. However, federated learning settings with a central server cannot meet these requirements in fully decentralized networks. In this paper, we propose a fully decentralized, efficient, and privacy-preserving global model training protocol, named PPT, for federated learning in Peer-to-Peer (P2P) networks. PPT uses one-hop communication to aggregate local model update parameters and adopts a symmetric cryptosystem to ensure security. Notably, PPT modifies the Eschenauer-Gligor (E-G) scheme to distribute the encryption keys. PPT also adopts Neighborhood Broadcast, Supervision and Report, and Termination as complementary mechanisms to enhance security and robustness. Through extensive analysis, we demonstrate that PPT resists various security threats and preserves user privacy. Experiments further demonstrate its utility and efficiency.
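The abstract mentions that keys for the symmetric cryptosystem are distributed via a modified Eschenauer-Gligor scheme. As background, the original E-G scheme pre-loads each node with a random subset (a "key ring") drawn from a large global key pool; two neighbors can establish a secure link whenever their rings intersect. The following is a minimal sketch of that baseline idea only; the pool and ring sizes are illustrative, and PPT's specific modifications are not reproduced here.

```python
import random

POOL_SIZE = 1000   # size of the global key pool (illustrative value)
RING_SIZE = 75     # number of keys pre-loaded onto each node (illustrative)

def make_key_ring(rng, pool_size=POOL_SIZE, ring_size=RING_SIZE):
    """Draw a node's key ring: a uniformly random subset of the key pool.

    Keys are represented by their indices into the pool for simplicity.
    """
    return set(rng.sample(range(pool_size), ring_size))

def shared_keys(ring_a, ring_b):
    """Keys two neighbors hold in common; any one of them can secure the link."""
    return ring_a & ring_b

rng = random.Random(0)
node_a = make_key_ring(rng)
node_b = make_key_ring(rng)

# The link between A and B is securable iff their rings intersect;
# with a large enough ring-to-pool ratio this holds with high probability.
link_securable = len(shared_keys(node_a, node_b)) > 0
```

In the full scheme, nodes that share no key can still establish a session key through a path of already-secured links, which is why a modest ring size suffices to connect a sparse P2P topology with high probability.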
