FLaPS: Federated Learning and Privately Scaling

09/13/2020
by   Sudipta Paul, et al.

Federated learning (FL) is a distributed learning process in which the model (weights and checkpoints) is transferred to the devices that possess the data, rather than the classical approach of transferring and aggregating the data centrally. In this way, sensitive data never leaves the user devices. FL uses the FedAvg algorithm, which trains via iterative model averaging over non-IID and unbalanced distributed data, without depending on the data quantity. FL has several open issues: 1) poor scalability, since the model is iteratively trained over all the devices, a problem amplified by device dropouts; 2) the security and privacy trade-offs of the learning process are still not robust enough; and 3) the overall communication efficiency and cost are high. To mitigate these challenges we present Federated Learning and Privately Scaling (FLaPS), an architecture that improves the scalability as well as the security and privacy of the system. The devices are grouped into clusters, which yields better privacy and a scaled turnaround time to finish a round of training. Therefore, even if a device drops out in the middle of training, the whole process can be restarted after a definite amount of time. Both the data and the model are communicated using differentially private reports with iterative shuffling, which provides a better privacy-utility trade-off. We evaluated FLaPS on the MNIST, CIFAR-10, and Tiny-ImageNet-200 datasets using various CNN models. Experimental results show that FLaPS is an improved, time- and privacy-scaled environment with after-learning parameters better than or comparable to those of the central and FL models.
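The iterative model averaging at the heart of FedAvg can be sketched as a weighted average of client weight updates, where each client's contribution is proportional to its local sample count. This is a minimal illustration of the aggregation step only (the function name `fedavg` and the NumPy-array representation of layer weights are assumptions for the sketch, not part of the FLaPS codebase):

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Aggregate client models by sample-weighted averaging (FedAvg-style).

    client_weights: list of models, one per client; each model is a list
                    of np.ndarray layer weights.
    client_sizes:   number of local training samples on each client.
    Returns the averaged model as a list of np.ndarray.
    """
    total = sum(client_sizes)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        # Each client's layer is scaled by its share of the total data.
        agg = sum(w[layer] * (n / total)
                  for w, n in zip(client_weights, client_sizes))
        averaged.append(agg)
    return averaged

# Two clients with unbalanced data: the larger client dominates the average.
clients = [[np.array([0.0])], [np.array([4.0])]]
model = fedavg(clients, client_sizes=[1, 3])  # → [array([3.])]
```

Because the weighting depends only on sample counts, the average is well defined even for unbalanced, non-IID partitions; a dropped client simply contributes nothing to that round's sum.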


