On the (Im)Possibility of Estimating Various Notions of Differential Privacy

08/30/2022
by Daniele Gorla, et al.

We analyze the extent to which final users can infer information about the level of protection of their data when the data obfuscation mechanism is a priori unknown to them (the so-called "black-box" scenario). In particular, we investigate various notions of differential privacy (DP), namely epsilon-DP, local DP, and Rényi DP. On the one hand, we prove that, without any assumption on the underlying distributions, no algorithm can infer the level of data protection with provable guarantees. On the other hand, we demonstrate that, under reasonable assumptions (namely, Lipschitzness of the involved densities on a closed interval), such guarantees exist and can be achieved by a simple histogram-based estimator. Then, using one of the best-known DP obfuscation mechanisms (namely, the Laplacian one), we verify experimentally that the theoretical number of samples required by our bound is actually much larger than the number needed in practice to obtain satisfactory results. Furthermore, we observe that the estimated epsilon is in practice much closer to the true one than our theorems foresee.
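As a rough illustration of the kind of histogram-based estimation the abstract describes, the Python sketch below samples the Laplace mechanism on two adjacent inputs, builds histograms of the two output distributions over a common binning, and takes the maximum absolute log-ratio across bins as an empirical epsilon estimate. The function name estimate_epsilon and the binning and support choices are illustrative assumptions, not the paper's actual estimator.

import numpy as np

def estimate_epsilon(samples_p, samples_q, bins=50, support=(-10, 10)):
    # Histogram-based estimate of the max log-ratio between two output
    # distributions: an empirical proxy for the epsilon of DP.
    edges = np.linspace(*support, bins + 1)
    p, _ = np.histogram(samples_p, bins=edges)
    q, _ = np.histogram(samples_q, bins=edges)
    # Keep only bins where both empirical frequencies are non-zero,
    # since the log-ratio is undefined otherwise.
    mask = (p > 0) & (q > 0)
    p = p[mask] / len(samples_p)
    q = q[mask] / len(samples_q)
    return np.max(np.abs(np.log(p / q)))

# Laplace mechanism with sensitivity 1 and target epsilon = 1.0,
# applied to two adjacent inputs x = 0 and x' = 1.
rng = np.random.default_rng(0)
eps, n = 1.0, 100_000
out_x  = 0.0 + rng.laplace(scale=1 / eps, size=n)
out_x1 = 1.0 + rng.laplace(scale=1 / eps, size=n)
print(f"true eps = {eps}, estimated eps = {estimate_epsilon(out_x, out_x1):.3f}")

Under Lipschitz-type assumptions on the densities like those stated in the abstract, one would expect such an estimate to approach the true epsilon as the number of samples and bins grow appropriately; the abstract's empirical finding is that far fewer samples than the theoretical bound suggests already give a close estimate.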
