Additive Noise Mechanisms for Making Randomized Approximation Algorithms Differentially Private

11/07/2022
by   Jakub Tětek, et al.

The exponential increase in the amount of available data makes taking advantage of it without violating users' privacy one of the fundamental problems of computer science for the 21st century. This question has been investigated thoroughly under the framework of differential privacy. However, most of the literature has not focused on settings where the amount of data is so large that we are not able to compute the exact answer even in the non-private setting (such as the streaming setting, the sublinear-time setting, etc.). This can often make the use of differential privacy infeasible in practice. In this paper, we show a general approach for making Monte Carlo randomized approximation algorithms differentially private. We only need to assume that the error R of the approximation algorithm is sufficiently concentrated around 0 (e.g., E[|R|] is bounded) and that the function being approximated has small global sensitivity Δ. First, we show that if the error is subexponential, then the Laplace mechanism with error magnitude proportional to the sum of Δ and the subexponential diameter of the error of the algorithm makes the algorithm differentially private. This is true even if the worst-case global sensitivity of the algorithm is large or even infinite. We then introduce a new additive noise mechanism, which we call the zero-symmetric Pareto mechanism. We show that using this mechanism, we can make an algorithm differentially private even if we only assume a bound on the first absolute moment of the error, E[|R|]. Finally, we use our results to give the first differentially private algorithms for various problems, including frequency moments, estimating the average degree of a graph in sublinear time, and estimating the size of the maximum matching. Our results raise many new questions; we state multiple open problems.
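The first result above, calibrating Laplace noise to Δ plus the subexponential diameter of the algorithm's error rather than to the (possibly infinite) worst-case sensitivity, can be sketched as follows. This is a minimal illustration, not the paper's precise calibration: the helper names (`laplace_noise`, `privatize`) and the scale formula `(Δ + subexponential diameter) / ε` are assumptions for exposition.

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from a zero-centered Laplace distribution
    with the given scale, via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on [-0.5, 0.5)
    sign = 1.0 if u >= 0 else -1.0
    return -scale * sign * math.log(1.0 - 2.0 * abs(u))


def privatize(approx_algorithm, data, epsilon, delta_f, subexp_diameter):
    """Run a Monte Carlo approximation algorithm and add Laplace noise.

    Illustrative assumption: the noise scale is (Δ + subexponential
    diameter of the error) / ε, following the abstract's description
    rather than the paper's exact constants.
    """
    scale = (delta_f + subexp_diameter) / epsilon
    return approx_algorithm(data) + laplace_noise(scale)
```

For intuition: with a very large ε (weak privacy), the added noise is tiny, so `privatize(lambda d: sum(d), [1, 2, 3], 1e6, 1.0, 0.0)` returns a value very close to 6; shrinking ε inflates the noise proportionally.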


Related research

- 02/11/2022: Privately Estimating Graph Parameters in Sublinear time
- 01/06/2023: Better Differentially Private Approximate Histograms and Heavy Hitters using the Misra-Gries Sketch
- 06/08/2017: Pain-Free Random Differential Privacy with Sensitivity Sampling
- 08/30/2022: Dynamic Global Sensitivity for Differentially Private Contextual Bandits
- 03/15/2021: A Central Limit Theorem for Differentially Private Query Answering
- 02/28/2018: INSPECTRE: Privately Estimating the Unseen
