Faithful Edge Federated Learning: Scalability and Privacy

06/30/2021
by Meng Zhang, et al.

Federated learning enables machine learning algorithms to be trained over a network of multiple decentralized edge devices without requiring the exchange of local datasets. Successfully deploying federated learning requires ensuring that agents (e.g., mobile devices) faithfully execute the intended algorithm, an issue that has been largely overlooked in the literature. In this study, we first use risk bounds to analyze how the key features of federated learning, unbalanced and non-i.i.d. data, affect agents' incentives to voluntarily participate in and obediently follow traditional federated learning algorithms. More specifically, our analysis reveals that agents with less typical data distributions and relatively more samples are more likely to opt out of or tamper with federated learning algorithms. To this end, we formulate the first faithful implementation problem of federated learning and design two faithful federated learning mechanisms that achieve desirable economic properties, scalability, and privacy. Further, the time complexity of computing all agents' payments is 𝒪(1) in the number of agents. First, we design a Faithful Federated Learning (FFL) mechanism that approximates the Vickrey-Clarke-Groves (VCG) payments via an incremental computation. We show that it achieves (probably approximate) optimality, faithful implementation, voluntary participation, and other economic properties (such as budget balance). Second, by partitioning agents into several subsets, we present a scalable approximation of the VCG mechanism. We further design a scalable and Differentially Private FFL (DP-FFL) mechanism, the first differentially private faithful mechanism, which maintains these economic properties. Our mechanism enables three-way performance tradeoffs among privacy, the number of iterations needed, and payment accuracy loss.
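The two ingredients the abstract names, VCG-style payments and differentially private payments, can be sketched in a toy model. Everything below is an illustrative assumption rather than the paper's construction: the concave accuracy function `acc`, the Clarke-pivot payment rule, and the Laplace noise scale are stand-ins chosen to show the mechanics only.

```python
import math
import random

def acc(samples):
    # Toy model quality: concave in the total number of samples pooled.
    # (An illustrative assumption, not the paper's risk bound.)
    return math.log(1.0 + sum(samples))

def vcg_payments(samples):
    """Clarke-pivot (VCG-style) payments: agent i's payment is the welfare
    change it imposes on the other agents. Negative payment means the
    mechanism pays agent i, rewarding data contribution."""
    n = len(samples)
    full = acc(samples)  # accuracy with every agent participating
    payments = []
    for i in range(n):
        others = samples[:i] + samples[i + 1:]
        # Welfare of the other n-1 agents with vs. without agent i's data.
        with_i = (n - 1) * full
        without_i = (n - 1) * acc(others)
        payments.append(without_i - with_i)
    return payments

def dp_payments(samples, epsilon, sensitivity=1.0):
    """Add Laplace(sensitivity/epsilon) noise to each payment. Larger epsilon
    means less noise: better payment accuracy but weaker privacy, the
    tradeoff the abstract refers to."""
    scale = epsilon / sensitivity
    noisy = []
    for p in vcg_payments(samples):
        # Difference of two iid exponentials is a Laplace sample.
        noise = random.expovariate(scale) - random.expovariate(scale)
        noisy.append(p + noise)
    return noisy
```

In this toy setting, an agent holding more samples receives a larger subsidy (a more negative payment), matching the intuition that payments must offset the incentive to opt out. The noise scale makes the privacy/accuracy tradeoff explicit; the paper's third axis, the number of iterations, comes from how many rounds the incremental payment computation runs.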


