A Theoretical Perspective on Differentially Private Federated Multi-task Learning

11/14/2020
by   Huiwen Wu, et al.

In the era of big data, expanding the amount of available data through data sharing to improve model performance has become increasingly compelling. Effective collaborative learning models must therefore be developed with both privacy and utility in mind. In this work, we propose a new federated multi-task learning method for effective parameter transfer, with differential privacy protecting gradients at the client level. Specifically, the lower layers of the network are shared across all clients to capture a transferable feature representation, while the top layers are task-specific for on-client personalization. The proposed algorithm naturally resolves the statistical heterogeneity problem in federated networks. To the best of our knowledge, we are the first to provide both privacy and utility guarantees for such a federated algorithm. Convergence is proved for Lipschitz-smooth objective functions in the non-convex, convex, and strongly convex settings. Empirical experiments on several datasets demonstrate the effectiveness of the proposed algorithm and verify the implications of the theoretical findings.
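The split described above (shared lower layers aggregated across clients, task-specific heads kept local, shared-layer gradients privatized before upload) can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a linear model with squared loss, uses the standard clip-and-add-Gaussian-noise mechanism for client-level gradient protection, and all function names (`dp_clip_and_noise`, `client_update`, `server_round`) and hyperparameters are illustrative.

```python
import numpy as np

def dp_clip_and_noise(grad, clip_norm, noise_mult, rng):
    """Clip a gradient to clip_norm, then add Gaussian noise (Gaussian mechanism)."""
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_mult * clip_norm, size=grad.shape)

def client_update(shared, head, data, lr, clip_norm, noise_mult, rng):
    """One local step for a toy linear model y ~ (X @ shared) @ head.

    Only the shared-layer gradient leaves the client, and it is clipped
    and noised first; the task-specific head stays local (personalization).
    """
    X, y = data
    hidden = X @ shared            # shared representation, shape (n, k)
    err = hidden @ head - y        # residuals of the task-specific head
    g_shared = X.T @ np.outer(err, head) / len(y)   # grad w.r.t. shared, (d, k)
    g_head = hidden.T @ err / len(y)                # grad w.r.t. head, (k,)
    head = head - lr * g_head                       # plain local SGD step
    g_shared_dp = dp_clip_and_noise(g_shared, clip_norm, noise_mult, rng)
    return g_shared_dp, head

def server_round(shared, client_grads, lr):
    """FedAvg-style step: average the privatized shared-layer gradients."""
    return shared - lr * np.mean(client_grads, axis=0)
```

A round then consists of each client calling `client_update` on its own data, the server averaging the returned (already privatized) gradients with `server_round`, and broadcasting the updated shared layers back; heads never leave the clients.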


Related research

12/20/2017
Differentially Private Federated Learning: A Client Level Perspective
Federated learning is a recent advance in privacy protection. In this co...

01/03/2023
Differentially Private Federated Clustering over Non-IID Data
Federated clustering (FedC) is an adaptation of centralized clustering i...

07/17/2022
Multi-Task and Transfer Learning for Federated Learning Applications
Federated learning enables many applications benefiting distributed and ...

06/10/2023
Differentially private sliced inverse regression in the federated paradigm
We extend the celebrated sliced inverse regression to address the challe...

03/13/2022
Private Non-Convex Federated Learning Without a Trusted Server
We study differentially private (DP) federated learning (FL) with non-co...

09/12/2019
Differentially Private Meta-Learning
Parameter-transfer is a well-known and versatile approach for meta-learn...

03/17/2023
Multi-Task Model Personalization for Federated Supervised SVM in Heterogeneous Networks
In this paper, we design an efficient distributed iterative learning met...
