Efficient Dropout-resilient Aggregation for Privacy-preserving Machine Learning

03/31/2022
by   Ziyao Liu, et al.

With the increasing adoption of data-hungry machine learning algorithms, personal data privacy has emerged as one of the key concerns that could hinder the success of digital transformation. As such, Privacy-Preserving Machine Learning (PPML) has received much attention from both academia and industry. However, organizations face a dilemma: on the one hand, they are encouraged to share data to enhance ML performance, but on the other hand, they could potentially breach the relevant data privacy regulations. Practical PPML typically allows multiple participants to individually train their ML models, which are then aggregated to construct a global model in a privacy-preserving manner, e.g., based on multi-party computation or homomorphic encryption. Nevertheless, in many important large-scale PPML applications, e.g., federated learning, where clients' gradients are aggregated to update a global model (as in consumer behavior modeling for mobile application services), some participants are inevitably resource-constrained mobile devices, which may drop out of the PPML system due to their mobility. Therefore, the resilience of privacy-preserving aggregation has become an important problem to tackle. In this paper, we propose a scalable privacy-preserving aggregation scheme that tolerates participant dropout at any time and, with properly chosen system parameters, is secure against both semi-honest and actively malicious adversaries. By replacing communication-intensive building blocks with a seed-homomorphic pseudo-random generator, and relying on the additive homomorphic property of the Shamir secret sharing scheme, our scheme outperforms state-of-the-art schemes by up to 6.37× in runtime and provides stronger dropout resilience. The simplicity of our scheme makes it attractive both for implementation and for further improvements.
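The additive homomorphic property the abstract relies on means that adding Shamir shares point-wise produces valid shares of the sum of the underlying secrets, so an aggregator can reconstruct only the aggregate and never an individual input. The following minimal Python sketch illustrates that property in isolation; it is not the paper's protocol, and the field prime, threshold, and helper names are assumptions chosen for the example.

```python
# Illustrative sketch of the additive homomorphism of Shamir secret sharing
# over a prime field: shares of x and shares of y, added point-wise, are
# shares of x + y. Not the paper's aggregation protocol.
import random

PRIME = 2**61 - 1  # example field prime, assumed for this toy demo


def share(secret, threshold, n_parties):
    """Split `secret` into n_parties Shamir shares with the given threshold."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(threshold - 1)]
    # Evaluate the random degree-(threshold-1) polynomial at x = 1..n_parties.
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n_parties + 1)]


def reconstruct(shares):
    """Lagrange interpolation at x = 0 to recover the shared value."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total


if __name__ == "__main__":
    inputs = [7, 11, 23]          # each participant's private value
    t, n = 3, 5                   # threshold and number of share holders
    all_shares = [share(v, t, n) for v in inputs]
    # Point-wise addition of the parties' shares yields shares of the sum.
    summed = [(x, sum(s[k][1] for s in all_shares) % PRIME)
              for k, (x, _) in enumerate(all_shares[0])]
    # Any t summed shares reconstruct the aggregate, not any individual input.
    assert reconstruct(summed[:t]) == sum(inputs) % PRIME
    print("reconstructed sum:", reconstruct(summed[:t]))
```

In an aggregation setting, this is what lets a server combine contributions without learning them: each client only ever reveals shares, and the reconstructed value is the sum.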

