Are We There Yet? Timing and Floating-Point Attacks on Differential Privacy Systems

12/10/2021
by Jiankai Jin, et al.

Differential privacy is the de facto privacy framework and has seen adoption in practice via a number of mature software platforms. Implementations of differentially private (DP) mechanisms have to be done carefully to ensure end-to-end security guarantees. In this paper we study two implementation flaws in the noise generation commonly used in DP systems. First, we examine the Gaussian mechanism's susceptibility to a floating-point representation attack. The premise of this first vulnerability is similar to the attack carried out by Mironov in 2011 against the Laplace mechanism. Our experiments show the attack's success against DP algorithms, including deep learning models trained using differentially private stochastic gradient descent. In the second part of the paper we study discrete counterparts of the Laplace and Gaussian mechanisms that were previously proposed to alleviate the shortcomings of the floating-point representation of real numbers. We show that such implementations unfortunately suffer from another side channel: a novel timing attack. An observer who can measure the time taken to draw (discrete) Laplace or Gaussian noise can predict the noise magnitude, which can then be used to recover sensitive attributes. This attack invalidates the differential privacy guarantees of systems implementing such mechanisms. We demonstrate that several commonly used, state-of-the-art implementations of differential privacy are susceptible to these attacks. We report success rates of up to 92.56% for end-to-end timing attacks on a private sum protected with the discrete Laplace mechanism. Finally, we evaluate and suggest partial mitigations.
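To make the timing side channel concrete, the following is a minimal Python sketch (not taken from the paper or from any DP library) of why discrete noise samplers can leak through running time: it draws a toy discrete Laplace variable whose geometric magnitude is produced by a Bernoulli loop, times each draw, and compares the average noise magnitude of the fastest and slowest draws. The sampler, the scale of 10.0, and the quartile comparison are illustrative assumptions; real implementations differ, but any sampler whose loop count depends on the sampled value exposes a similar signal.

    import math
    import random
    import time

    def sample_geometric(p):
        """Number of failures before the first success in repeated
        Bernoulli(p) trials. The loop runs longer for larger samples,
        which is the data-dependent running time a timing attacker observes."""
        k = 0
        while random.random() >= p:
            k += 1
        return k

    def sample_discrete_laplace(scale):
        """Toy two-sided geometric ("discrete Laplace") sampler: geometric
        magnitude with parameter 1 - exp(-1/scale) and a uniformly random sign.
        Real DP libraries use more careful samplers, but many still loop a
        data-dependent number of times."""
        p = 1.0 - math.exp(-1.0 / scale)
        magnitude = sample_geometric(p)
        sign = 1 if random.random() < 0.5 else -1
        return sign * magnitude

    # Time many draws and compare the noise magnitude of the fastest and
    # slowest quartiles: if timing leaked nothing, the two means would match.
    draws = []
    for _ in range(20000):
        start = time.perf_counter()
        noise = sample_discrete_laplace(scale=10.0)
        elapsed = time.perf_counter() - start
        draws.append((elapsed, abs(noise)))

    draws.sort()                      # sort by elapsed time
    quartile = len(draws) // 4
    fast, slow = draws[:quartile], draws[-quartile:]

    def mean_abs(pairs):
        return sum(mag for _, mag in pairs) / len(pairs)

    print("mean |noise| in fastest quartile:", mean_abs(fast))
    print("mean |noise| in slowest quartile:", mean_abs(slow))

With a loop-based sampler like this, the slowest draws carry a substantially larger mean magnitude than the fastest ones; that correlation between observable time and hidden noise is what the end-to-end attacks in the paper exploit.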

Related research

07/27/2022
Precision-based attacks and interval refining: how to break, then fix, differential privacy on finite computers
Despite being raised as a problem over ten years ago, the imprecision of...

07/21/2021
Secure Random Sampling in Differential Privacy
Differential privacy is among the most prominent techniques for preservi...

12/09/2019
Implementing the Exponential Mechanism with Base-2 Differential Privacy
Despite excellent theoretical support, Differential Privacy (DP) can sti...

11/28/2022
Differentially Private Multivariate Statistics with an Application to Contingency Table Analysis
Differential privacy (DP) has become a rigorous central concept in priva...

11/17/2021
Network Generation with Differential Privacy
We consider the problem of generating private synthetic versions of real...

12/17/2020
Differential privacy and noisy confidentiality concepts for European population statistics
The paper aims to give an overview of various approaches to statistical ...

01/01/2021
Disclosure Risk from Homogeneity Attack in Differentially Private Frequency Distribution
Homogeneity attack allows adversaries to obtain the exact values on the ...
