Federated Split GANs

07/04/2022
by Pranvera Kortoci, et al.

Mobile devices and the immense amount and variety of data they generate are key enablers of machine learning (ML)-based applications. Traditional ML techniques have shifted toward new paradigms such as federated learning (FL) and split learning (SL) to improve the protection of users' data privacy. However, these paradigms often rely on server(s) located at the edge or in the cloud to train the computationally heavy parts of an ML model, so as to avoid draining the limited resources of client devices; this exposes device data to such third parties. This work proposes an alternative approach that trains computationally heavy ML models on the user devices themselves, where the corresponding device data resides. Specifically, we focus on generative adversarial networks (GANs) and leverage their inherent privacy-preserving attribute. We train the discriminative part of a GAN with raw data on users' devices, whereas the generative model is trained remotely (e.g., on a server), which requires no access to true sensor data. Moreover, our approach ensures that the computational load of training the discriminative model is shared among users' devices, proportionally to their computation capabilities, by means of SL. We implement our proposed collaborative training scheme of a computationally heavy GAN model on real resource-constrained devices. The results show that our system preserves data privacy, keeps training time short, and yields the same model accuracy as training on unconstrained devices (e.g., in the cloud). Our code can be found at https://github.com/YukariSonz/FSL-GAN
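The training split described in the abstract can be illustrated with a minimal, single-process sketch. This is not taken from the FSL-GAN repository: the framework (PyTorch), model sizes, and names such as `train_step` are assumptions for illustration. The point it shows is that the discriminator stays with the private data while the generator is trained remotely, and only synthetic samples and their gradients cross the device/server boundary; the actual system additionally splits the discriminator across several devices via SL.

```python
# Minimal sketch (assumptions noted above, not the authors' implementation):
# the discriminator D is trained with raw local data, while the generator G
# is trained "remotely" and only exchanges fake samples and gradients.
import torch
import torch.nn as nn

LATENT_DIM, DATA_DIM = 64, 784  # hypothetical sizes

# --- server side: generator only ---
G = nn.Sequential(nn.Linear(LATENT_DIM, 256), nn.ReLU(),
                  nn.Linear(256, DATA_DIM), nn.Tanh())
opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)

# --- device side: discriminator only, next to the private data ---
D = nn.Sequential(nn.Linear(DATA_DIM, 256), nn.LeakyReLU(0.2),
                  nn.Linear(256, 1))
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def train_step(real_batch):
    """One round: raw data never leaves the device; only fake samples
    and the gradients computed on them reach the generator."""
    bsz = real_batch.size(0)

    # 1) Server samples noise and generates fakes (sent to the device).
    z = torch.randn(bsz, LATENT_DIM)
    fake = G(z)

    # 2) Device updates D using its private real data and the received fakes.
    opt_D.zero_grad()
    d_loss = bce(D(real_batch), torch.ones(bsz, 1)) + \
             bce(D(fake.detach()), torch.zeros(bsz, 1))
    d_loss.backward()
    opt_D.step()

    # 3) Device scores the fakes; the gradient w.r.t. the fakes flows back
    #    to the server, which updates G without ever seeing real data.
    opt_G.zero_grad()
    g_loss = bce(D(fake), torch.ones(bsz, 1))
    g_loss.backward()
    opt_G.step()
    return d_loss.item(), g_loss.item()
```

In the paper's setting, steps 1 and 3 would run over a network link between the server and the device(s), and the discriminator itself would be further partitioned across devices, with each device training a share proportional to its computation capability.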

