DP-NormFedAvg: Normalizing Client Updates for Privacy-Preserving Federated Learning

06/13/2021
by Rudrajit Das, et al.

In this paper, we focus on facilitating differentially private quantized communication between the clients and the server in federated learning (FL). To this end, we propose that the clients send a private quantized version of only the unit vector along the change in their local parameters to the server, completely discarding the magnitude information. We call this algorithm DP-NormFedAvg and show that it has the same order-wise convergence rate as FedAvg on smooth quasar-convex functions (an important class of non-convex functions for modeling the optimization of deep neural networks), thereby establishing that discarding the magnitude information is not detrimental from an optimization point of view. We also introduce QTDL, a new differentially private quantization mechanism for unit-norm vectors, which we use in DP-NormFedAvg. QTDL employs discrete noise with a Laplacian-like distribution on a finite support to provide privacy. We show that, under a growth-condition assumption on the per-sample client losses, the extra per-coordinate communication cost incurred in each round due to privacy by our method is 𝒪(1) with respect to the model dimension, which is an improvement over prior work. Finally, we demonstrate the efficacy of the proposed method with experiments on fully-connected neural networks trained on CIFAR-10 and Fashion-MNIST.
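To make the client-side step concrete, below is a minimal Python sketch of the idea described in the abstract: the client computes its local parameter change, keeps only the unit vector along that change, and sends a privatized, quantized version of it. The abstract does not specify the exact QTDL noise distribution or quantization grid, so the function name, parameters, and the "discrete Laplacian-like noise on a finite support" step below are illustrative stand-ins under those assumptions, not the paper's actual mechanism.

```python
# Hypothetical sketch of the client-side step: send a privately quantized
# unit vector along the local parameter change, discarding the magnitude.
# The noise/quantization details are stand-ins for QTDL, whose exact
# construction is not given in the abstract.
import numpy as np

def client_update_message(local_params, global_params, levels=16, scale=0.05, rng=None):
    """Return a privatized, quantized unit vector along the local update."""
    rng = np.random.default_rng() if rng is None else rng
    delta = local_params - global_params          # change in local parameters
    norm = np.linalg.norm(delta)
    if norm == 0.0:
        return np.zeros_like(delta)
    unit = delta / norm                           # keep only the direction

    # Illustrative stand-in for QTDL: snap each coordinate to a finite grid
    # in [-1, 1], add discrete two-sided-geometric ("Laplacian-like") noise
    # on the grid indices, then clip back to the finite support.
    grid = np.linspace(-1.0, 1.0, levels)
    idx = np.argmin(np.abs(unit[:, None] - grid[None, :]), axis=1)
    p = 1.0 - np.exp(-1.0 / (scale * levels))
    noise = rng.geometric(p, size=idx.shape) - rng.geometric(p, size=idx.shape)
    noisy_idx = np.clip(idx + noise, 0, levels - 1)
    return grid[noisy_idx]                        # server averages these directions

# Example: one client round on a toy 8-dimensional model.
g = np.zeros(8)
l = np.array([0.3, -0.1, 0.0, 0.2, -0.4, 0.1, 0.0, 0.05])
print(client_update_message(l, g))
```

Because only a quantized direction on a finite support is transmitted, the per-coordinate communication cost stays constant in the model dimension, which is the property the paper highlights.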


