Computing Exact Guarantees for Differential Privacy

06/07/2019
by Antti Koskela, et al.

Quantification of the privacy loss associated with a randomised algorithm has become an active area of research, and (ε,δ)-differential privacy has arisen as the standard way of measuring it. We propose a numerical method for evaluating the differential privacy parameters of algorithms with continuous one-dimensional output. In this way the parameters ε and δ can be evaluated, for example, for the subsampled multidimensional Gaussian mechanism, which is also the underlying mechanism of differentially private stochastic gradient descent. The proposed method is based on a numerical approximation of an integral formula that gives the exact (ε,δ)-values. The approximation is carried out by discretising the integral and evaluating the resulting discrete convolutions using the fast Fourier transform. We give theoretical error bounds which show the convergence of the approximation and guarantee its accuracy to an arbitrary degree. Experimental comparisons with state-of-the-art techniques illustrate the efficacy of the method. Python code for the proposed method is available on GitHub (https://github.com/DPBayes/PLD-Accountant/).
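
As a rough illustration of the approach (a minimal sketch, not the authors' PLD-Accountant implementation), the code below discretises the privacy loss distribution (PLD) of a plain, non-subsampled Gaussian mechanism on a finite grid, composes it k times by raising its FFT to the k-th power, and evaluates δ(ε) from the resulting tail sum. The grid width L, the step ds and the example values of σ, k and ε are illustrative assumptions, not parameters from the paper.

# Minimal sketch, assuming a plain (non-subsampled) Gaussian mechanism with
# sensitivity 1: discretise its privacy loss distribution (PLD), compose it
# k times via FFT, and read off delta(eps) from the tail sum.
import numpy as np

def gaussian_pld(sigma, L=4.0, ds=1e-3):
    """Discretised PLD of the Gaussian mechanism, N(0, sigma^2) vs N(1, sigma^2).

    The privacy loss random variable is itself Gaussian with mean
    1/(2 sigma^2) and variance 1/sigma^2; mass outside [-L, L) is truncated.
    """
    grid = np.arange(-L, L, ds)
    mu, var = 1.0 / (2.0 * sigma**2), 1.0 / sigma**2
    pmf = np.exp(-(grid - mu) ** 2 / (2.0 * var))
    return grid, pmf / pmf.sum()

def compose_pld_fft(pmf, k):
    """k-fold convolution of a discrete PLD, computed with a single FFT.

    Zero-padding to at least the support length of the k-fold convolution
    makes the circular convolution equal to the linear one.
    """
    m = k * (len(pmf) - 1) + 1
    fft_len = 1 << (m - 1).bit_length()          # next power of two >= m
    comp = np.fft.irfft(np.fft.rfft(pmf, fft_len) ** k, fft_len)[:m]
    return np.clip(comp, 0.0, None)              # clip tiny negative round-off

def delta_of_epsilon(eps, sigma, k, L=4.0, ds=1e-3):
    """Approximate tight delta(eps) after k compositions:
    delta(eps) = sum_{s > eps} (1 - exp(eps - s)) * omega_k(s).
    """
    grid, pmf = gaussian_pld(sigma, L, ds)
    comp = compose_pld_fft(pmf, k)
    s = k * grid[0] + ds * np.arange(comp.size)  # grid of the composed PLD
    tail = s > eps
    return float(np.sum((1.0 - np.exp(eps - s[tail])) * comp[tail]))

if __name__ == "__main__":
    # Illustrative parameters only: 100 compositions of a Gaussian mechanism
    # with noise sigma = 10 and sensitivity 1, evaluated at eps = 1.
    print(delta_of_epsilon(eps=1.0, sigma=10.0, k=100))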
