Smoothed Differential Privacy

07/04/2021
by   Ao Liu, et al.

Differential privacy (DP) is a widely accepted and widely applied notion of privacy based on worst-case analysis. Often, DP classifies most mechanisms without external noise as non-private [Dwork et al., 2014], so external noise, such as Gaussian or Laplacian noise [Dwork et al., 2006], is introduced to improve privacy. In many real-world applications, however, adding external noise is undesirable and sometimes prohibited. For example, presidential elections often require a deterministic rule to be used [Liu et al., 2020], and small amounts of noise can lead to dramatic decreases in the prediction accuracy of deep neural networks, especially on underrepresented classes [Bagdasaryan et al., 2019]. In this paper, we propose a natural extension and relaxation of DP following the worst average-case idea behind the celebrated smoothed analysis [Spielman and Teng, 2004]. Our notion, smoothed DP, can effectively measure the privacy leakage of mechanisms without external noise under realistic settings. We prove several strong properties of smoothed DP, including composability and robustness to post-processing. We also prove that any discrete mechanism with a sampling procedure is more private than what DP predicts, whereas many continuous mechanisms with sampling procedures remain non-private under smoothed DP. Experimentally, we first verify that discrete sampling mechanisms are private in real-world elections. We then apply the smoothed DP notion to quantized gradient descent, which shows that some neural networks can be private without adding any extra noise. We believe these results contribute to the theoretical foundation of realistic privacy measures beyond worst-case analysis.
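As background for the noise-addition approach the abstract contrasts with, here is a minimal sketch of the classical Laplace mechanism [Dwork et al., 2006], which achieves epsilon-DP for a counting query by adding Laplace noise calibrated to the query's sensitivity. The function names and the toy vote data are illustrative, not from the paper:

```python
import math
import random


def laplace_noise(scale: float) -> float:
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = random.random() - 0.5  # uniform on (-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))


def laplace_mechanism(true_count: int, epsilon: float) -> float:
    """Release a count under epsilon-DP.

    A counting query has sensitivity 1 (one person changes the count
    by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)


# Toy example: privately release the number of "yes" votes.
votes = [1, 0, 1, 1, 0, 1]
noisy_count = laplace_mechanism(sum(votes), epsilon=0.5)
```

The paper's point is that in settings like elections, this added noise is exactly what is undesirable or prohibited, motivating a privacy measure for noise-free mechanisms.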


