Infinitely Divisible Noise in the Low Privacy Regime

10/13/2021
by Rasmus Pagh, et al.

Federated learning, in which training data is distributed among users and never shared, has emerged as a popular approach to privacy-preserving machine learning. Cryptographic techniques such as secure aggregation are used to aggregate contributions, like a model update, from all users. A robust technique for making such aggregates differentially private is to exploit infinite divisibility of the Laplace distribution, namely, that a Laplace distribution can be expressed as a sum of i.i.d. noise shares from a Gamma distribution, one share added by each user. However, Laplace noise is known to have suboptimal error in the low privacy regime for ε-differential privacy, where ε > 1 is a large constant. In this paper we present the first infinitely divisible noise distribution for real-valued data that achieves ε-differential privacy and has expected error that decreases exponentially with ε.
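As a quick illustration of the decomposition the abstract relies on (not the paper's new mechanism), the sketch below draws one Gamma-difference noise share per user and checks numerically that the shares sum to Laplace noise. The function name and parameters are illustrative, and numpy is assumed.

```python
# Sketch (assumption, not from the paper): a Laplace(0, b) variable can be
# written as a sum of n i.i.d. shares, each the difference of two
# Gamma(1/n, scale=b) variables, so each user can add one share before
# secure aggregation.
import numpy as np

rng = np.random.default_rng(0)

def laplace_shares(n_users: int, scale: float, size: int = 1) -> np.ndarray:
    """Return an (n_users, size) array of noise shares whose column sums
    are distributed as Laplace(0, scale)."""
    shape = 1.0 / n_users
    return (rng.gamma(shape, scale, (n_users, size))
            - rng.gamma(shape, scale, (n_users, size)))

n_users, scale = 100, 1.0
shares = laplace_shares(n_users, scale, size=100_000)
aggregate_noise = shares.sum(axis=0)  # noise the aggregator sees after summation

# Empirical check: the variance of Laplace(0, b) is 2 * b^2.
print(aggregate_noise.var(), 2 * scale**2)
```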


Related research

12/30/2019  Differentially Private M-band Wavelet-Based Mechanisms in Machine Learning Environments
06/19/2019  Scalable and Differentially Private Distributed Aggregation in the Shuffled Model
02/25/2021  Discrete Distribution Estimation with Local Differential Privacy: A Comparative Analysis
05/01/2019  The Podium Mechanism: Improving on the Laplace and Staircase Mechanisms
07/27/2020  Learning discrete distributions: user vs item-level privacy
06/15/2023  Training generative models from privatized data
08/16/2022  Optimal Gamma density to Obfuscate Quantitative data with Added Noise
