MYSTIKO: Cloud-Mediated, Private, Federated Gradient Descent

12/01/2020
by K. R. Jayaram et al.

Federated learning enables multiple distributed participants (potentially on different clouds) to collaboratively train machine/deep learning models by sharing parameters/gradients rather than raw data. However, sharing gradients instead of centralizing data may not be as private as one would expect: reverse-engineering attacks on plaintext gradients have been demonstrated to be practically feasible. Existing solutions for differentially private federated learning, while promising, lead to less accurate models and require nontrivial hyperparameter tuning. In this paper, we examine the use of additive homomorphic encryption (specifically, the Paillier cipher) to design secure federated gradient descent techniques that (i) do not require the addition of statistical noise or hyperparameter tuning, (ii) do not alter the accuracy or utility of the final model, (iii) ensure that the plaintext model parameters/gradients of a participant are never revealed to any other participant or third-party coordinator involved in the federated learning job, (iv) minimize the trust placed in any third-party coordinator, and (v) are efficient and cost effective, with minimal overhead.
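The core mechanism here, additive homomorphism, is easy to illustrate in code. Below is a minimal Python sketch of Paillier-based gradient aggregation, assuming the high-level protocol the abstract describes: each participant encrypts a fixed-point encoding of its gradient, ciphertexts are multiplied (which adds the underlying plaintexts), and only the aggregate sum is ever decrypted. The toy primes, the SCALE constant, and the single shared keypair are simplifying assumptions for illustration, not MYSTIKO's actual parameters or key-management design; a real deployment would use a vetted Paillier implementation with a 2048-bit or larger modulus.

import math
import random

# --- Toy Paillier keypair. INSECURE demo primes: real deployments need
# 2048-bit+ moduli generated by a vetted cryptographic library.
p, q = 293, 433
n = p * q                                  # public modulus
n_sq = n * n
g = n + 1                                  # standard generator choice
lam = math.lcm(p - 1, q - 1)               # Carmichael's lambda(n)
mu = pow((pow(g, lam, n_sq) - 1) // n, -1, n)  # L(g^lam)^-1 mod n

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, with r random and coprime to n."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """D(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    return ((pow(c, lam, n_sq) - 1) // n) * mu % n

# Fixed-point encoding so float gradients fit the integer plaintext
# space; negative values are represented modulo n.
SCALE = 1000                               # illustrative precision only

def encode(x: float) -> int:
    return round(x * SCALE) % n

def decode(m: int) -> float:
    if m > n // 2:                         # map back modular negatives
        m -= n
    return m / SCALE

# Secure aggregation: each participant submits only a ciphertext, and
# multiplying ciphertexts adds the plaintexts, so no individual
# gradient is ever visible in the clear. (Here one keypair stands in
# for the protocol's key setup, for brevity.)
gradients = [0.12, -0.05, 0.30]            # one scalar per participant
ciphertexts = [encrypt(encode(grad)) for grad in gradients]

agg = 1
for c in ciphertexts:
    agg = (agg * c) % n_sq                 # E(m1) * E(m2) = E(m1 + m2)

print(decode(decrypt(agg)))                # 0.37 = 0.12 - 0.05 + 0.30

Because the aggregation is exact and no statistical noise is added, the decrypted sum matches what plaintext aggregation would produce, which is why this approach does not alter the accuracy or utility of the final model.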

Related research

12/13/2021: Efficient Differentially Private Secure Aggregation for Federated Learning via Hardness of Learning with Errors
Federated machine learning leverages edge computing to develop models fr...

11/25/2020: Distributed Additive Encryption and Quantization for Privacy Preserving Federated Deep Learning
Homomorphic encryption is a very useful gradient protection technique us...

11/10/2020: Compression Boosts Differentially Private Federated Learning
Federated Learning allows distributed entities to train a common model c...

10/09/2020: Voting-based Approaches For Differentially Private Federated Learning
While federated learning (FL) enables distributed agents to collaborativ...

04/27/2021: Confined Gradient Descent: Privacy-preserving Optimization for Federated Learning
Federated learning enables multiple participants to collaboratively trai...

06/01/2022: Optimization with access to auxiliary information
We investigate the fundamental optimization question of minimizing a tar...

05/05/2023: Data Station: Delegated, Trustworthy, and Auditable Computation to Enable Data-Sharing Consortia with a Data Escrow
Pooling and sharing data increases and distributes its value. But since ...
