Privacy Assessment of Federated Learning using Private Personalized Layers

by   Théo Jourdan, et al.

Federated Learning (FL) is a collaborative scheme for training a learning model across multiple participants without sharing data. While FL is a clear step forward towards enforcing users' privacy, various inference attacks have been developed against it. In this paper, we quantify the utility and privacy trade-off of an FL scheme using private personalized layers. While this scheme has been proposed as a local adaptation to improve the accuracy of the model through local personalization, it also has the advantage of minimizing the information about the model exchanged with the server. However, the privacy of such a scheme has never been quantified. Our evaluations using a motion sensor dataset show that personalized layers speed up the convergence of the model and slightly improve the accuracy for all users compared to a standard FL scheme, while better preventing both attribute and membership inferences compared to an FL scheme using local differential privacy.
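To illustrate the idea, here is a minimal sketch of FL with private personalized layers: each client keeps its personal layers on-device and only sends the shared base layers to the server for averaging. This is an illustrative toy, not the paper's implementation; all names (`Client`, `federated_average`, the `base`/`personal` split) are hypothetical.

```python
# Sketch, assuming a FedPer-style split: base layers are exchanged with the
# server, personalized layers never leave the device. Names are illustrative.
import numpy as np

def federated_average(client_updates):
    """Average a list of parameter dicts element-wise (plain FedAvg)."""
    keys = client_updates[0].keys()
    return {k: np.mean([u[k] for u in client_updates], axis=0) for k in keys}

class Client:
    def __init__(self, base, personal):
        self.base = base          # shared layers, sent to the server
        self.personal = personal  # private layers, kept local

    def shared_update(self):
        # Only the base layers are revealed to the server, which limits
        # the information an attacker observing the exchange can exploit.
        return {k: v.copy() for k, v in self.base.items()}

    def apply_global(self, global_base):
        # Personal layers are untouched by the global update.
        self.base = {k: v.copy() for k, v in global_base.items()}

# One communication round with two clients
clients = [
    Client(base={"w1": np.ones(3)}, personal={"head": np.zeros(2)}),
    Client(base={"w1": 3 * np.ones(3)}, personal={"head": np.ones(2)}),
]
global_base = federated_average([c.shared_update() for c in clients])
for c in clients:
    c.apply_global(global_base)

print(global_base["w1"])  # averaged base layers: [2. 2. 2.]
```

The design choice the paper evaluates follows from this split: since the personalized layers are never transmitted, inference attacks mounted on the server side can only observe the base layers.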


