cpSGD: Communication-efficient and differentially-private distributed SGD

05/27/2018
by   Naman Agarwal, et al.

Distributed stochastic gradient descent is an important subroutine in distributed learning. A setting of particular interest is when the clients are mobile devices, where two important concerns are communication efficiency and client privacy. Several recent works have focused on reducing the communication cost or introducing privacy guarantees, but none of the proposed communication-efficient methods are known to be privacy-preserving, and none of the known privacy mechanisms are known to be communication-efficient. To this end, we study algorithms that achieve both communication efficiency and differential privacy. For d variables and n ≈ d clients, the proposed method uses O(log log(nd)) bits of communication per client per coordinate and ensures constant privacy. We also extend and improve previous analysis of the Binomial mechanism, showing that it achieves nearly the same utility as the Gaussian mechanism while requiring fewer representation bits, which may be of independent interest.
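The abstract's core idea can be sketched as follows: each client clips its gradient, stochastically quantizes it to a small set of integer levels, and adds centered Binomial noise Bin(N, p) − Np before transmission, so every coordinate is sent as a small integer; the server averages the messages and rescales. This is a hedged illustration only — the function names and parameter defaults (`binomial_mechanism`, `num_levels`, `N`, `p`, `clip`) are assumptions for the sketch, not values from the paper, and no privacy calibration of N and p is attempted here.

```python
import numpy as np

def binomial_mechanism(grad, num_levels=16, N=64, p=0.5, clip=1.0, rng=None):
    """Illustrative client-side encoder (parameters are assumptions)."""
    rng = np.random.default_rng() if rng is None else rng
    g = np.clip(grad, -clip, clip)
    # Map [-clip, clip] onto [0, num_levels - 1] and round stochastically,
    # which keeps the quantization unbiased in expectation.
    scaled = (g + clip) / (2 * clip) * (num_levels - 1)
    low = np.floor(scaled)
    q = low + (rng.random(g.shape) < (scaled - low))
    # Centered Binomial noise Bin(N, p) - Np: integer-valued and mean zero,
    # standing in for the Gaussian noise of the Gaussian mechanism.
    z = rng.binomial(N, p, size=g.shape) - int(N * p)
    return (q + z).astype(int)  # small integers -> few bits per coordinate

def decode(messages, num_levels=16, clip=1.0):
    """Server side: average the integer messages and undo the scaling."""
    avg = np.mean(messages, axis=0)
    return avg / (num_levels - 1) * (2 * clip) - clip
```

With many clients the noise averages out: averaging a few thousand simulated client messages and decoding recovers the true mean gradient closely, while each transmitted coordinate fits in roughly log2(num_levels + N) bits rather than a full float.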


Related research

06/07/2022 · Shuffled Check-in: Privacy Amplification towards Practical Distributed Learning
Recent studies of distributed computation with formal privacy guarantees...

10/30/2019 · Efficient Privacy-Preserving Nonconvex Optimization
While many solutions for privacy-preserving convex empirical risk minimi...

10/03/2022 · β-Stochastic Sign SGD: A Byzantine Resilient and Differentially Private Gradient Compressor for Federated Learning
Federated Learning (FL) is a nascent privacy-preserving learning framewo...

01/12/2020 · Private and Communication-Efficient Edge Learning: A Sparse Differential Gaussian-Masking Distributed SGD Approach
With rise of machine learning (ML) and the proliferation of smart mobile...

04/26/2023 · Killing Two Birds with One Stone: Quantization Achieves Privacy in Distributed Learning
Communication efficiency and privacy protection are two critical issues ...

11/03/2019 · Privacy for Free: Communication-Efficient Learning with Differential Privacy Using Sketches
Communication and privacy are two critical concerns in distributed learn...

11/27/2018 · LEASGD: an Efficient and Privacy-Preserving Decentralized Algorithm for Distributed Learning
Distributed learning systems have enabled training large-scale models ov...
