Disclosure Risk from Homogeneity Attack in Differentially Private Frequency Distribution

01/01/2021
by   Fang Liu, et al.

A homogeneity attack allows adversaries to learn the exact values of sensitive attributes for their targets without having to re-identify them in released data. Differential privacy (DP) is a mathematical framework that provides robust privacy guarantees against a wide range of privacy attacks. We propose a measure of the disclosure risk from homogeneity attack and derive closed-form relationships between the DP privacy loss parameters and this disclosure risk when the released data are multi-dimensional frequency distributions. The availability of closed-form relationships not only saves the time and computational resources otherwise spent calculating the relationships numerically, but also aids understanding of DP and its privacy loss parameters by placing these abstract concepts in the context of a concrete privacy attack, and offers a different perspective on choosing privacy loss parameters and implementing differentially private mechanisms for data sanitization and release in practice. We apply the closed-form relationships to real-life data sets and demonstrate their consistency with empirical assessments of the disclosure risk due to homogeneity attack on sanitized data.
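To make the setting concrete, here is a minimal sketch of one standard way to release a multi-dimensional frequency distribution under epsilon-DP: the Laplace mechanism applied cell-wise to a contingency table. This is an illustrative example, not the paper's specific mechanism or risk measure; the function name `sanitize_histogram` and the example table are assumptions for illustration.

```python
import numpy as np

def sanitize_histogram(counts, epsilon, rng=None):
    """Release a frequency table under epsilon-DP via the Laplace mechanism.

    Adding or removing one individual's record changes exactly one cell
    by 1, so the L1 sensitivity of the full count vector is 1 and the
    Laplace noise scale is 1/epsilon.
    """
    rng = np.random.default_rng() if rng is None else rng
    counts = np.asarray(counts, dtype=float)
    noisy = counts + rng.laplace(loc=0.0, scale=1.0 / epsilon, size=counts.shape)
    # Post-processing (rounding and clamping to nonnegative integers)
    # does not weaken the DP guarantee.
    return np.clip(np.round(noisy), 0, None).astype(int)

# Example: a 2x2 cross-tabulation of a quasi-identifier vs. a sensitive
# attribute (hypothetical counts). Smaller epsilon means more noise and,
# per the paper's theme, lower disclosure risk from homogeneity attack.
table = np.array([[40, 5], [35, 20]])
private_table = sanitize_histogram(table, epsilon=1.0)
```

A homogeneity attack succeeds when all records sharing a target's quasi-identifier values fall into a single sensitive-attribute cell; the noise injected above perturbs such homogeneous cells, which is the link between the privacy loss parameter epsilon and the disclosure risk studied in the paper.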


