How to Make Your Approximation Algorithm Private: A Black-Box Differentially-Private Transformation for Tunable Approximation Algorithms of Functions with Low Sensitivity

10/07/2022
by Jeremiah Blocki, et al.

We develop a framework for efficiently transforming certain approximation algorithms into differentially-private variants, in a black-box manner. Our results focus on algorithms A that output an approximation to a function f of the form (1-a)f(x) - k ≤ A(x) ≤ (1+a)f(x) + k, where 0 ≤ a < 1 is a parameter that can be "tuned" to small enough values while incurring only a polynomial blowup in the running time/space. We show that such algorithms can be made DP without sacrificing accuracy, as long as the function f has small global sensitivity. We achieve these results by applying the smooth sensitivity framework developed by Nissim, Raskhodnikova, and Smith (STOC 2007). Our framework naturally transforms non-private FPRAS (resp. FPTAS) algorithms into (ϵ,δ)-DP (resp. ϵ-DP) approximation algorithms.

We apply our framework in the context of sublinear-time and sublinear-space algorithms, while preserving the nature of the algorithm in meaningful ranges of the parameters. Our results include the first (to the best of our knowledge) (ϵ,δ)-edge-DP sublinear-time algorithms for estimating the number of triangles, the number of connected components, and the weight of a minimum spanning tree (MST) of a graph, as well as a more efficient algorithm (which sacrifices pure DP, in contrast to previous results) for estimating the average degree of a graph. In the area of streaming algorithms, our results include (ϵ,δ)-DP algorithms for estimating L_p-norms, distinct elements, and weighted MST for both insertion-only and turnstile streams. Our transformation also yields a private version of the smooth histogram framework, which is commonly used to convert streaming algorithms into sliding-window variants, and achieves a multiplicative approximation for many problems, such as estimating L_p-norms, distinct elements, and the length of the longest increasing subsequence.
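To make the shape of such a transformation concrete, below is a minimal Python sketch of the generic "release a noisy estimate" pattern that the framework instantiates. The names (privatize_approximation, sensitivity_bound) are hypothetical, and the fixed sensitivity_bound passed by the caller is a placeholder assumption: the paper's actual transformation calibrates noise to the smooth sensitivity of the tuned approximation (following Nissim, Raskhodnikova, and Smith) rather than to a plain global bound, which is what allows accuracy to be preserved.

```python
import numpy as np


def privatize_approximation(approx_alg, sensitivity_bound, epsilon, rng=None):
    """Sketch: wrap a non-private estimator with Laplace noise.

    approx_alg(x)      -- non-private algorithm satisfying
                          (1-a)f(x) - k <= approx_alg(x) <= (1+a)f(x) + k
    sensitivity_bound  -- assumed bound on how much the released value may
                          change between neighboring inputs (placeholder; the
                          paper calibrates this via smooth sensitivity instead)
    epsilon            -- privacy budget
    """
    rng = rng or np.random.default_rng()

    def private_alg(x):
        estimate = approx_alg(x)
        # Laplace mechanism: noise scale = sensitivity / epsilon.
        return estimate + rng.laplace(loc=0.0, scale=sensitivity_bound / epsilon)

    return private_alg


# Toy usage: count edges of a graph given as an edge list. Edge count has
# global sensitivity 1 under edge-neighboring graphs; here the "approximation"
# is exact (a = 0, k = 0) purely for illustration.
exact_edge_count = lambda edges: len(edges)
private_edge_count = privatize_approximation(exact_edge_count,
                                             sensitivity_bound=1.0,
                                             epsilon=0.5)
print(private_edge_count([(0, 1), (1, 2), (2, 3)]))
```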
