
LDP-FL: Practical Private Aggregation in Federated Learning with Local Differential Privacy

07/31/2020
by Lichao Sun, et al. (SAMSUNG; University of Illinois at Chicago)

Training machine learning models on sensitive user data has raised increasing privacy concerns in many areas. Federated learning is a popular approach for privacy protection that collects local gradient information instead of real data. One way to achieve a strict privacy guarantee is to apply local differential privacy to federated learning. However, previous works do not give a practical solution due to three issues. First, the noisy data is close to its original value with high probability, increasing the risk of information exposure. Second, a large variance is introduced into the estimated average, causing poor accuracy. Last, the privacy budget explodes due to the high dimensionality of weights in deep learning models. In this paper, we propose a novel local differential privacy mechanism for federated learning that addresses the above issues. It makes the perturbed data more distinct from its original value and introduces lower variance. Moreover, the proposed mechanism bypasses the curse of dimensionality by splitting and shuffling model updates. A series of empirical evaluations on three commonly used datasets, MNIST, Fashion-MNIST, and CIFAR-10, demonstrates that our solution can achieve superior deep learning performance while providing a strong privacy guarantee.
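To make the abstract's two ideas concrete, the sketch below illustrates (a) a two-output epsilon-LDP perturbation of a scalar weight clipped to a known range [c - r, c + r], which always reports one of two values far from the original yet stays unbiased in expectation, and (b) a toy split-and-shuffle step before server-side averaging. This is a minimal illustration under assumed settings (the function names, clipping range, privacy budget, and the shuffle helper are assumptions for illustration), not the paper's exact mechanism or implementation.

import numpy as np

def perturb_weight(w, c, r, eps, rng):
    """Two-output epsilon-LDP report for a scalar weight w in [c - r, c + r].

    The report is always one of two values far from w (hiding the original),
    but its expectation equals w, so averaging over many clients is unbiased.
    Sketch only; not the paper's exact formulation.
    """
    w = float(np.clip(w, c - r, c + r))
    bound = r * (np.exp(eps) + 1.0) / (np.exp(eps) - 1.0)
    # Probability of reporting the upper value, chosen so that E[report] = w.
    p = 0.5 + (w - c) * (np.exp(eps) - 1.0) / (2.0 * r * (np.exp(eps) + 1.0))
    return c + bound if rng.random() < p else c - bound

def split_and_shuffle(perturbed_updates, rng):
    """Toy 'split and shuffle' step (assumed helper, illustration only).

    Each client's perturbed vector is split into per-parameter reports and all
    reports are shuffled, so the server cannot link a full update to one client.
    """
    reports = [(j, v) for update in perturbed_updates
               for j, v in enumerate(update)]
    order = rng.permutation(len(reports))
    return [reports[i] for i in order]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    eps, c, r = 1.0, 0.0, 0.5          # assumed privacy budget and weight range
    n_clients, dim = 2000, 4
    true_weights = rng.uniform(c - r, c + r, size=dim)

    # Each client perturbs every coordinate of its local update independently.
    updates = [[perturb_weight(w, c, r, eps, rng) for w in true_weights]
               for _ in range(n_clients)]
    reports = split_and_shuffle(updates, rng)

    # Server averages the shuffled per-parameter reports.
    sums, counts = np.zeros(dim), np.zeros(dim)
    for j, v in reports:
        sums[j] += v
        counts[j] += 1
    print("true    :", np.round(true_weights, 3))
    print("estimate:", np.round(sums / counts, 3))

With enough clients, the per-parameter averages concentrate around the true weights even though every individual report is one of only two extreme values, which is the intuition behind combining a low-variance unbiased LDP report with splitting and shuffling.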

Related research

12/03/2018
Protection Against Reconstruction and Its Applications in Private Federated Learning
Federated learning has become an exciting direction for both research an...

09/28/2022
Momentum Gradient Descent Federated Learning with Local Differential Privacy
Nowadays, the development of information technology is growing rapidly. ...

08/23/2021
Federated Learning Meets Fairness and Differential Privacy
Deep learning's unprecedented success raises several ethical concerns ra...

09/11/2020
Federated Model Distillation with Noise-Free Differential Privacy
Conventional federated learning directly averaging model weights is only...

03/15/2022
Training a Tokenizer for Free with Private Federated Learning
Federated learning with differential privacy, i.e. private federated lea...

06/05/2020
LDP-Fed: Federated Learning with Local Differential Privacy
This paper presents LDP-Fed, a novel federated learning system with a fo...

07/16/2022
Sotto Voce: Federated Speech Recognition with Differential Privacy Guarantees
Speech data is expensive to collect, and incredibly sensitive to its sou...