PPT: A Privacy-Preserving Global Model Training Protocol for Federated Learning in P2P Networks

05/30/2021
by Qian Chen, et al.

The concept of Federated Learning has emerged at the convergence of distributed machine learning and information and communication technology. It is vital to the development of distributed machine learning, which is expected to be fully decentralized, robust, communication-efficient, and secure. However, federated learning settings with a central server cannot meet these requirements in fully decentralized networks. In this paper, we propose a fully decentralized, efficient, and privacy-preserving global model training protocol, named PPT, for federated learning in Peer-to-Peer (P2P) networks. PPT uses one-hop communication to aggregate local model update parameters and adopts a symmetric cryptosystem to ensure security. Notably, PPT modifies the Eschenauer-Gligor (E-G) scheme to distribute the encryption keys. PPT also adopts Neighborhood Broadcast, Supervision and Report, and Termination as complementary mechanisms to enhance security and robustness. Through extensive analysis, we demonstrate that PPT resists various security threats and preserves user privacy. Extensive experiments also demonstrate its utility and efficiency.
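As background for the key-distribution step mentioned above: in the original Eschenauer-Gligor scheme, each peer is pre-loaded with a random subset (a "key ring") drawn from a large key pool, and two neighbors can establish a secure link whenever their rings intersect. The following is a minimal sketch of that basic idea only — the abstract does not detail PPT's specific modification, and all names, sizes, and helpers here are illustrative assumptions.

```python
import random

def generate_key_pool(pool_size, seed=0):
    """Model the global key pool as a list of random 64-bit identifiers."""
    rng = random.Random(seed)
    return [rng.getrandbits(64) for _ in range(pool_size)]

def assign_key_ring(pool, ring_size, rng):
    """Pre-load a peer with a random subset (key ring) of the pool."""
    return set(rng.sample(pool, ring_size))

def shared_keys(ring_a, ring_b):
    """Two neighbors can set up a symmetric-key link iff their rings intersect."""
    return ring_a & ring_b

# Illustrative parameters: pool of 1000 keys, rings of 50 keys per peer.
rng = random.Random(42)
pool = generate_key_pool(1000)
alice = assign_key_ring(pool, 50, rng)
bob = assign_key_ring(pool, 50, rng)
common = shared_keys(alice, bob)  # keys usable for a shared symmetric cipher
```

With these parameters, any two neighbors share at least one key with high probability, which is what makes probabilistic key pre-distribution practical in a P2P topology with no central key server.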
