Hercules: Boosting the Performance of Privacy-preserving Federated Learning

07/11/2022 ∙ by Guowen Xu, et al.
In this paper, we address the problem of privacy-preserving federated neural network training with N users. We present Hercules, an efficient and high-precision training framework that can tolerate collusion of up to N-1 users. Hercules follows the POSEIDON framework proposed by Sav et al. (NDSS'21), but makes a qualitative leap in performance with the following contributions: (i) we design a novel parallel homomorphic computation method for matrix operations, which enables fast Single Instruction, Multiple Data (SIMD) operations over ciphertexts. For the multiplication of two h × h matrices, our method reduces the computation complexity from O(h^3) to O(h). This greatly improves the training efficiency of the neural network, since the ciphertext computation is dominated by the convolution operations; (ii) we present an efficient approximation of the sign function based on composite polynomial approximation. It is used to approximate non-polynomial functions (i.e., ReLU and max) with optimal asymptotic complexity. Extensive experiments on various benchmark datasets (BCW, ESR, CREDIT, MNIST, SVHN, CIFAR-10 and CIFAR-100) show that, compared with POSEIDON, Hercules obtains up to a 4× reduction in computation and communication cost.
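The two contributions can be illustrated in plaintext (no encryption involved). The sketch below assumes the standard diagonal encoding commonly used for SIMD matrix products over packed ciphertext slots, and the well-known composite iteration f(x) = (3x − x³)/2 for approximating sign on [−1, 1]; the function names are illustrative and do not come from the paper.

```python
def rotate(v, i):
    """Cyclic left rotation, the plaintext analogue of a ciphertext slot rotation."""
    return v[i:] + v[:i]

def diag_matvec(M, v):
    """Matrix-vector product via the diagonal method:
    M @ v = sum_i diag_i(M) * rotate(v, i),
    costing h rotations and h slot-wise products rather than
    h^2 scalar multiplications when slots are processed in parallel."""
    h = len(M)
    out = [0.0] * h
    for i in range(h):
        d = [M[j][(j + i) % h] for j in range(h)]  # i-th generalized diagonal
        r = rotate(v, i)
        out = [o + a * b for o, a, b in zip(out, d, r)]
    return out

def approx_sign(x, depth=15):
    """Composite polynomial approximation of sign(x) on [-1, 1]:
    iterating f(x) = (3x - x^3) / 2 pushes any nonzero x toward +1 or -1."""
    for _ in range(depth):
        x = (3 * x - x ** 3) / 2
    return x

def approx_relu(x):
    """ReLU recovered from sign: relu(x) = x * (1 + sign(x)) / 2."""
    return x * (1 + approx_sign(x)) / 2
```

Because every step above is a polynomial or a rotation, both routines map directly onto ciphertext operations in an HE scheme such as CKKS.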

research ∙ 09/01/2020
POSEIDON: Privacy-Preserving Federated Neural Network Learning
In this paper, we address the problem of privacy-preserving training and...

research ∙ 03/17/2020
Privacy-preserving Weighted Federated Learning within Oracle-Aided MPC Framework
This paper studies privacy-preserving weighted federated learning within...

research ∙ 01/19/2022
Scotch: An Efficient Secure Computation Framework for Secure Aggregation
Federated learning enables multiple data owners to jointly train a machi...

research ∙ 06/10/2022
Fast Deep Autoencoder for Federated learning
This paper presents a novel, fast and privacy preserving implementation ...

research ∙ 09/03/2020
ESMFL: Efficient and Secure Models for Federated Learning
Deep Neural Networks are widely applied to various domains. The successf...

research ∙ 07/28/2022
Privacy-Preserving Federated Recurrent Neural Networks
We present RHODE, a novel system that enables privacy-preserving trainin...

research ∙ 10/23/2019
Stochastic Channel-Based Federated Learning for Medical Data Privacy Preserving
Artificial neural network has achieved unprecedented success in the medi...
