Secure Federated Submodel Learning

11/06/2019
by Chaoyue Niu, et al.

Federated learning was proposed with an intriguing vision of achieving collaborative machine learning among numerous clients without uploading their private data to a cloud server. However, the conventional framework requires each client to leverage the full model for learning, which can be prohibitively inefficient for resource-constrained clients and large-scale deep learning tasks. We thus propose a new framework, called federated submodel learning, where clients download only the needed parts of the full model, namely submodels, and then upload the submodel updates. Nevertheless, the "position" of a client's truly required submodel corresponds to her private data, and its disclosure to the cloud server during interactions inevitably breaks the tenet of federated learning. To reconcile efficiency and privacy, we design a secure federated submodel learning scheme with a private set union protocol as its cornerstone. The secure scheme combines randomized response, secure aggregation, and Bloom filters, and endows each client with customized plausible deniability, in terms of local differential privacy, for the position of her desired submodel, thereby protecting her private data. We further instantiate the scheme in the e-commerce recommendation scenario at Alibaba, implement a prototype system, and extensively evaluate its performance over 30 days of Taobao user data. The analysis and evaluation results demonstrate the feasibility and scalability of our scheme in terms of model accuracy and convergence as well as practical communication, computation, and storage overheads, and manifest its remarkable advantages over the conventional federated learning framework.
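To make the privacy mechanism concrete, the minimal Python sketch below shows how a client could encode the positions of her desired submodel (e.g., the item IDs touched by her local data) in a Bloom filter and perturb it with randomized response before submission. The Bloom filter size m, the number of hashes k, the privacy budget eps, and all function names are illustrative assumptions rather than the paper's exact construction, and the secure-aggregation step that masks the perturbed filters is omitted.

```python
import hashlib
import math
import random

# Hedged sketch of the client-side building blocks suggested by the abstract:
# a Bloom filter encodes the "position" of a client's desired submodel, and
# randomized response perturbs each bit for local differential privacy before
# the (omitted) secure-aggregation step. Parameters m, k, eps are assumptions.

def bloom_positions(item_id, m, k):
    """Map an item ID to k bit positions via independently salted hashes."""
    return [
        int(hashlib.sha256(f"{item_id}:{salt}".encode()).hexdigest(), 16) % m
        for salt in range(k)
    ]

def build_bloom_filter(item_ids, m=1024, k=3):
    """Encode the client's submodel index set as an m-bit Bloom filter."""
    bits = [0] * m
    for item_id in item_ids:
        for pos in bloom_positions(item_id, m, k):
            bits[pos] = 1
    return bits

def randomized_response(bits, eps=2.0):
    """Keep each bit with probability e^eps / (1 + e^eps), otherwise flip it,
    giving per-bit eps-local differential privacy (plausible deniability)."""
    p_keep = math.exp(eps) / (1.0 + math.exp(eps))
    return [b if random.random() < p_keep else 1 - b for b in bits]

if __name__ == "__main__":
    # A client whose private data involves items 17, 42, and 99 reports only a
    # perturbed Bloom filter; in the full scheme this vector would additionally
    # be masked and summed across clients via secure aggregation.
    client_items = [17, 42, 99]
    noisy_filter = randomized_response(build_bloom_filter(client_items))
    print(sum(noisy_filter), "bits set after perturbation")
```

The intended effect, per the abstract, is that each client gains plausible deniability about which submodel positions she truly requires, while the cloud server can still derive an approximate union of positions for coordinating submodel downloads.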

Related research

02/20/2022 - Collusion Resistant Federated Learning with Oblivious Distributed Differential Privacy
10/12/2020 - Differentially Private Secure Multi-Party Computation for Federated Learning in Financial Applications
06/18/2022 - Fully Privacy-Preserving Federated Representation Learning via Secure Embedding Aggregation
01/18/2023 - Private Federated Submodel Learning via Private Set Union
07/07/2021 - RoFL: Attestable Robustness for Secure Federated Learning
06/12/2023 - AnoFel: Supporting Anonymity for Privacy-Preserving Federated Learning
11/24/2021 - Efficient Secure Aggregation Based on SHPRG For Federated Learning
