SoteriaFL: A Unified Framework for Private Federated Learning with Communication Compression

06/20/2022
by   Zhize Li, et al.

To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has been made recently in designing communication-efficient federated learning algorithms with the aid of communication compression. On the other hand, privacy preservation, especially at the client level, is another important desideratum that has not yet been addressed simultaneously in the presence of advanced communication compression techniques. In this paper, we propose a unified framework that enhances the communication efficiency of private federated learning with communication compression. Exploiting both general compression operators and local differential privacy, we first examine a simple algorithm that applies compression directly to differentially-private stochastic gradient descent, and identify its limitations. We then propose a unified framework, SoteriaFL, for private federated learning, which accommodates a general family of local gradient estimators, including popular stochastic variance-reduced gradient methods and the state-of-the-art shifted compression scheme. We provide a comprehensive characterization of its performance trade-offs in terms of privacy, utility, and communication complexity, and show that SoteriaFL achieves better communication complexity than other private federated learning algorithms without communication compression, while sacrificing neither privacy nor utility.
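To give a concrete feel for the ingredients named in the abstract, below is a minimal Python sketch of one client-side communication round that combines local differential privacy (Gaussian noise added before anything leaves the client) with a general compression operator and a shifted compression step. This is an illustrative sketch only, not the authors' implementation; the function names, the top-k compressor, and the shift step size `gamma` are assumptions made for exposition.

```python
import numpy as np


def top_k_compress(v, k):
    """A simple biased compression operator: keep the k largest-magnitude
    coordinates and zero out the rest (one member of the general family of
    compressors allowed by the framework)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out


def client_round(grad_estimate, shift, sigma, k, rng, gamma=0.5):
    """One client-side round in the spirit of shifted compression with
    local DP (names and structure are hypothetical, for illustration).

    grad_estimate : local (possibly variance-reduced) gradient estimator
    shift         : client-maintained reference vector for shifted compression
    sigma         : Gaussian noise scale calibrated to the local DP budget
    k             : sparsity level of the compressor
    gamma         : assumed step size for the shift update
    """
    # Local differential privacy: perturb the local gradient before it
    # leaves the client (Gaussian mechanism).
    private_grad = grad_estimate + rng.normal(0.0, sigma, size=grad_estimate.shape)

    # Shifted compression: compress the difference to the shift rather than
    # the raw (noisy) gradient, so the compressed quantity shrinks as the
    # iterates converge.
    message = top_k_compress(private_grad - shift, k)

    # Keep the local shift in sync with the server-side copy.
    new_shift = shift + gamma * message

    # Only the sparse `message` is transmitted; the server reconstructs
    # shift + message as its estimate of the client's private gradient.
    return message, new_shift
```

The "simple algorithm" the abstract refers to would, in this sketch, correspond to compressing `private_grad` directly with no shift; the shifted variant is what allows compression without degrading the privacy-utility trade-off.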


