Wireless Federated Learning with Limited Communication and Differential Privacy

06/01/2021
by   Amir Sonee, et al.

This paper investigates the role of dimensionality reduction in communication efficiency and differential privacy (DP) of the local datasets at remote users in an over-the-air computation (AirComp)-based federated learning (FL) model. More precisely, we consider an FL setting in which clients train a machine learning model through simultaneous, channel-aware, and rate-limited communications with a parameter server (PS) over a Gaussian multiple-access channel (GMAC), so that transmissions sum coherently at the PS, which has global knowledge of the channel coefficients. For this setting, an algorithm is proposed that combines federated stochastic gradient descent (FedSGD) for minimizing a given loss function from the local gradients, a Johnson-Lindenstrauss (JL) random projection for reducing the dimension of the local updates, and artificial noise to further protect users' privacy. Our results show that this scheme improves local DP mainly because noise of greater variance can be injected on each dimension while the sensitivity of the projected vectors remains unchanged; the convergence rate, however, is slower than in the case without dimensionality reduction. Because the privacy gain comes at the cost of slower convergence, the privacy-convergence trade-off is larger, but it is shown to lessen in the high-dimensional regime, yielding almost the same trade-off at a much lower communication cost.
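The mechanism the abstract describes can be sketched as follows: each client compresses its local gradient with a JL random projection and then adds artificial Gaussian noise before transmission. This is a minimal illustrative sketch, not the paper's exact algorithm; the function name, noise calibration, and parameter choices below are assumptions for demonstration.

```python
import numpy as np

def jl_project_with_noise(grad, k, noise_std, rng):
    """Illustrative sketch: JL random projection of a local gradient
    followed by artificial Gaussian noise (Gaussian mechanism for local DP).
    `noise_std` is a hypothetical parameter; the paper calibrates noise
    to the DP budget, which is not reproduced here."""
    d = grad.shape[0]
    # A JL matrix with i.i.d. N(0, 1/k) entries preserves Euclidean norms
    # in expectation, so the L2 sensitivity of the update stays (roughly)
    # unchanged while the dimension drops from d to k.
    G = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))
    projected = G @ grad                                     # dimension d -> k
    noisy = projected + rng.normal(0.0, noise_std, size=k)   # artificial noise
    return noisy

rng = np.random.default_rng(0)
d, k = 10_000, 500
grad = rng.normal(size=d)           # stand-in for a local gradient
update = jl_project_with_noise(grad, k, noise_std=0.1, rng=rng)
```

Because the projection keeps the L2 norm (and hence the sensitivity) roughly constant while spreading it over fewer coordinates, the same noise budget buys more per-dimension variance, which is the source of the privacy improvement the abstract mentions.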


Related research

05/15/2020  Efficient Federated Learning over Multiple Access Channel with Differential Privacy Constraints
In this paper, the problem of federated learning (FL) over a multiple ac...

05/01/2022  A New Dimensionality Reduction Method Based on Hensel's Compression for Privacy Protection in Federated Learning
Differential privacy (DP) is considered a de-facto standard for protecti...

02/09/2021  Federated Learning with Local Differential Privacy: Trade-offs between Privacy, Utility, and Communication
Federated learning (FL) allows to train a massive amount of data private...

03/02/2021  Privacy Amplification for Federated Learning via User Sampling and Wireless Aggregation
In this paper, we study the problem of federated learning over a wireles...

07/09/2022  The Poisson binomial mechanism for secure and private federated learning
We introduce the Poisson Binomial mechanism (PBM), a discrete differenti...

04/16/2022  FedVQCS: Federated Learning via Vector Quantized Compressed Sensing
In this paper, a new communication-efficient federated learning (FL) fra...

05/05/2023  Over-the-Air Federated Averaging with Limited Power and Privacy Budgets
To jointly overcome the communication bottleneck and privacy leakage of ...
