Communication Efficient Private Federated Learning Using Dithering

09/14/2023
by Burak Hasircioglu, et al.

Preserving privacy while ensuring communication efficiency is a fundamental challenge in federated learning. In this work, we tackle this challenge in the trusted aggregator model and propose a solution that achieves both objectives simultaneously. We show that a quantization scheme based on subtractive dithering at the clients can effectively replicate the Gaussian noise addition performed at the aggregator. As a result, we can guarantee the same level of differential privacy against other clients while substantially reducing the amount of communication required, compared to transmitting full-precision gradients and adding noise centrally. We also demonstrate experimentally that the accuracy of our approach matches that of the full-precision gradient method.
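To illustrate the core mechanism, the following is a minimal sketch of subtractive dithered quantization, not the paper's actual implementation: the client adds a dither signal drawn from a random generator whose seed is shared with the aggregator, quantizes, and sends the result; the aggregator regenerates and subtracts the same dither. With subtractive dithering, the residual quantization error is uniform on half a quantization step and independent of the input, which is what lets the aggregated errors stand in for explicitly added noise. All names and parameters here are illustrative.

```python
import numpy as np

def dithered_quantize(x, step, dither):
    """Client side: add the shared dither, then round to the quantization grid."""
    return step * np.round((x + dither) / step)

# Hypothetical gradient vector (stands in for a client's model update).
grad = np.random.default_rng(0).normal(size=1000)

step = 0.5  # quantization step; coarser steps mean fewer bits per coordinate

# Dither drawn from a generator seeded identically at client and aggregator,
# uniform on [-step/2, step/2].
shared_rng = np.random.default_rng(42)
dither = shared_rng.uniform(-step / 2, step / 2, size=grad.shape)

q = dithered_quantize(grad, step, dither)   # what the client transmits
recovered = q - dither                      # aggregator subtracts the same dither

# Key property of subtractive dithering: the error is uniform on
# [-step/2, step/2] and independent of grad.
err = recovered - grad
print(float(err.min()), float(err.max()))
```

Summed over many clients, these independent uniform errors concentrate toward a Gaussian shape, which is the intuition behind replicating central Gaussian noise addition through quantization alone.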


