Toward Evaluating Re-identification Risks in the Local Privacy Model

by Takao Murakami, et al.

LDP (Local Differential Privacy) has recently attracted much attention as a metric of data privacy that prevents the inference of personal data from obfuscated data in the local model. However, there are scenarios in which the adversary instead performs re-identification attacks to link the obfuscated data back to users. Because LDP is not designed to directly prevent re-identification, it can cause excessive obfuscation and destroy utility in these scenarios. In this paper, we propose a measure of re-identification risks, which we call the PIE (Personal Information Entropy). The PIE is designed to directly prevent re-identification attacks in the local model: it lower-bounds the adversary's lowest possible re-identification error probability (i.e., the Bayes error probability). We analyze the relation between LDP and the PIE, and analyze the PIE and utility in distribution estimation for two obfuscation mechanisms providing LDP. Through experiments, we show that when re-identification is the privacy risk of concern, LDP can cause excessive obfuscation and destroy utility, and that the PIE can guarantee low re-identification risks for local obfuscation mechanisms while keeping high utility.
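To make the setting concrete, the sketch below shows k-ary randomized response, a standard ε-LDP obfuscation mechanism, together with the usual debiased frequency estimator for distribution estimation. This is illustrative background only, not the paper's PIE construction or its specific mechanisms; the function names are our own.

```python
import math
import random

def krr_perturb(value, k, epsilon, rng=random):
    """k-ary randomized response: report the true value (from {0, ..., k-1})
    with probability e^eps / (e^eps + k - 1), otherwise report a uniformly
    random other value. This mechanism satisfies epsilon-LDP."""
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if rng.random() < p_true:
        return value
    # Pick uniformly among the k - 1 values other than the true one.
    other = rng.randrange(k - 1)
    return other if other < value else other + 1

def krr_estimate(reports, k, epsilon):
    """Unbiased estimate of the input distribution from k-RR reports:
    empirical report frequencies debiased by the known flip probabilities."""
    n = len(reports)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)  # P(report = true value)
    q = 1.0 / (math.exp(epsilon) + k - 1)                # P(report = each other value)
    counts = [0] * k
    for r in reports:
        counts[r] += 1
    return [(c / n - q) / (p - q) for c in counts]
```

Since p + (k-1)q = 1, the debiased estimates always sum to exactly 1; larger ε keeps the true value more often, improving estimation accuracy at the cost of weaker obfuscation.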

