Geometry of Sensitivity: Twice Sampling and Hybrid Clipping in Differential Privacy with Optimal Gaussian Noise and Application to Deep Learning

09/06/2023
by   Hanshen Xiao, et al.

We study the fundamental problem of constructing optimal randomization in Differential Privacy (DP). Depending on the clipping strategy or additional properties of the processing function, the corresponding sensitivity set determines the randomization necessary to achieve the required security parameters. Toward the optimal utility-privacy tradeoff, finding the minimal perturbation for a properly selected sensitivity set stands as a central problem in DP research. In practice, l_2/l_1-norm clipping with Gaussian/Laplace noise mechanisms is among the most common setups, but it suffers from the curse of dimensionality. For more generic clipping strategies, the optimal noise for a high-dimensional sensitivity set remains poorly understood. In this paper, we revisit the geometry of high-dimensional sensitivity sets and present a series of results characterizing the non-asymptotically optimal Gaussian noise for Rényi DP (RDP). Our results are both negative and positive: on the one hand, we show that the curse of dimensionality is tight for a broad class of sensitivity sets satisfying certain symmetry properties; on the other hand, if the representation of the sensitivity set is asymmetric on some group of orthogonal bases, we show that the optimal noise bound need not depend explicitly on either the dimension or the rank. We also revisit sampling in the high-dimensional scenario, the key to both privacy amplification and computational efficiency in large-scale data processing. We propose a novel method, termed twice sampling, which applies both sample-wise and coordinate-wise sampling so that Gaussian noise fits the sensitivity geometry more closely. With a closed-form RDP analysis, we prove that twice sampling yields an asymptotic improvement in privacy amplification under an additional infinity-norm restriction, especially for small sampling rates.
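To make the setup concrete, the sketch below illustrates the two ingredients the abstract combines: per-sample l_2-norm clipping with Gaussian noise (the standard Gaussian mechanism setup), and a "twice sampling" step that subsamples both examples and coordinates before aggregation. This is a minimal illustration under assumed parameters (`C`, `sigma`, `q_sample`, `q_coord` are hypothetical names), not the paper's actual mechanism or its RDP calibration.

```python
import numpy as np

def clip_l2(g, C):
    """Clip a per-sample gradient vector to l2-norm at most C."""
    norm = np.linalg.norm(g)
    return g * min(1.0, C / norm) if norm > 0 else g

def twice_sampled_gaussian_sum(grads, C, sigma, q_sample, q_coord, rng):
    """Sketch of twice sampling: Poisson-subsample examples (sample-wise)
    and coordinates (coordinate-wise), clip each surviving gradient,
    sum, and add Gaussian noise scaled to the clipping threshold C.
    The noise multiplier sigma is a free parameter here; the paper's
    closed-form RDP analysis would determine how it may be reduced."""
    d = grads.shape[1]
    # Sample-wise subsampling: keep each example independently with rate q_sample.
    keep = rng.random(grads.shape[0]) < q_sample
    # Coordinate-wise subsampling: a shared random mask over coordinates.
    mask = rng.random(d) < q_coord
    total = np.zeros(d)
    for g in grads[keep]:
        total += clip_l2(g * mask, C)
    # Gaussian perturbation with standard deviation sigma * C (sensitivity C).
    return total + rng.normal(0.0, sigma * C, size=d)
```

The point of the coordinate-wise step is that each surviving gradient is sparse before clipping, which is what lets the noise track the sensitivity geometry more tightly than plain sample-wise subsampling.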

