
Differential Private Discrete Noise Adding Mechanism: Conditions, Properties and Optimization

by Shuying Qin, et al.
The University of Hong Kong

Differential privacy is a standard framework for quantifying the privacy loss incurred during data anonymization. To preserve differential privacy, a random noise-adding mechanism is widely adopted, and the trade-off between the privacy level and data utility is of central concern. The privacy and utility properties of continuous noise-adding mechanisms have been well studied. However, existing work is insufficient for discrete random mechanisms on discretely distributed data, e.g., traffic data and health records. This paper focuses on discrete random noise-adding mechanisms. We study the basic differential privacy conditions and properties of general discrete random mechanisms, as well as the trade-off between data privacy and data utility. Specifically, we derive a necessary and sufficient condition for discrete epsilon-differential privacy and a sufficient condition for discrete (epsilon, delta)-differential privacy, together with numerical estimation of the differential privacy parameters. These conditions can be applied to analyze the differential privacy properties of discrete noise-adding mechanisms with various kinds of noise. Then, under these differential privacy guarantees, we propose an optimal discrete epsilon-differentially private noise-adding mechanism within a utility-maximization framework, where utility is characterized by the similarity of the statistical properties between the mechanism's input and output. For this setup, we find that the class of discrete noise probability distributions in the optimal mechanism is staircase-shaped.
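As an illustration of the kind of condition the abstract describes, the sketch below checks a discrete noise-adding mechanism numerically. It uses the two-sided geometric (discrete Laplace) distribution, a well-known discrete noise choice, which is an assumption for illustration and not necessarily the paper's optimal staircase distribution. For a unit-sensitivity integer query, epsilon-differential privacy holds when the ratio of the noise pmf at adjacent integers is bounded by e^epsilon; the helper below estimates that ratio over a finite support.

```python
import math

def geometric_pmf(k: int, eps: float, sensitivity: int = 1) -> float:
    """PMF of two-sided geometric (discrete Laplace) noise at integer k:
    P(k) = (1 - a) / (1 + a) * a^|k|, with a = exp(-eps / sensitivity)."""
    alpha = math.exp(-eps / sensitivity)
    return (1.0 - alpha) / (1.0 + alpha) * alpha ** abs(k)

def max_privacy_ratio(eps: float, support=range(-50, 50)) -> float:
    """Numerically estimate sup_k P(k) / P(k + 1) over a finite support.
    A bound of e^eps on this ratio gives eps-differential privacy for
    unit-sensitivity queries under this noise distribution."""
    return max(geometric_pmf(k, eps) / geometric_pmf(k + 1, eps) for k in support)
```

For this distribution the worst-case adjacent ratio is attained at k >= 0 and equals exactly e^epsilon, so the mechanism meets the epsilon-differential privacy condition with equality, mirroring the tightness one expects from an optimized discrete mechanism.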




Grafting Laplace and Gaussian distributions: A new noise mechanism for differential privacy

The framework of Differential privacy protects an individual's privacy w...

Optimal Noise-Adding Mechanism in Additive Differential Privacy

We derive the optimal (0, δ)-differentially private query-output indepen...

Privacy-Utility Trade-off of Linear Regression under Random Projections and Additive Noise

Data privacy is an important concern in machine learning, and is fundame...

Batching of Tasks by Users of Pseudonymous Forums: Anonymity Compromise and Protection

There are a number of forums where people participate under pseudonyms. ...

The power of private likelihood-ratio tests for goodness-of-fit in frequency tables

Privacy-protecting data analysis investigates statistical methods under ...

Block Design-Based Local Differential Privacy Mechanisms

In this paper, we propose a new class of local differential privacy (LDP...

Duff: A Dataset-Distance-Based Utility Function Family for the Exponential Mechanism

We propose and analyze a general-purpose dataset-distance-based utility ...