Quantifying Differential Privacy in Continuous Data Release under Temporal Correlations
Differential Privacy (DP) has received increasing attention as a rigorous privacy framework. Many existing studies employ traditional DP mechanisms (e.g., the Laplace mechanism) as primitives to continuously release private data, protecting privacy at each time point (i.e., event-level privacy). These mechanisms assume that the data at different time points are independent, or that adversaries have no knowledge of the correlations between them. However, continuously generated data tend to be temporally correlated, and such correlations can be acquired by adversaries. In this paper, we investigate the potential privacy loss of a traditional DP mechanism under temporal correlations. First, we analyze the privacy leakage of a DP mechanism when adversaries have knowledge of such temporal correlations. Our analysis reveals that the event-level privacy loss of a DP mechanism may increase over time, while the user-level privacy loss remains as expected. We call this unexpected privacy loss temporal privacy leakage (TPL). Second, we design efficient algorithms for quantifying TPL. Although TPL may increase over time, we find that its supremum exists in some cases. Third, we propose mechanisms that convert any existing DP mechanism into one that protects against temporal privacy leakage. Experiments confirm that our approach is efficient and effective.
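For concreteness, the following is a minimal sketch (not the paper's implementation) of the event-level release setting described above: an independent Laplace-perturbed value is published at every time point. The stream values, `sensitivity`, and `epsilon` below are hypothetical parameters chosen for illustration.

```python
import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release one epsilon-DP value by adding Laplace(sensitivity / epsilon) noise."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

def continuous_release(stream, sensitivity=1.0, epsilon=1.0):
    """Release a private value at every time point (event-level privacy).

    Each release is epsilon-DP in isolation; the paper's analysis shows
    that, under temporal correlations known to the adversary, the
    effective event-level loss can exceed epsilon over time.
    """
    return [laplace_mechanism(x, sensitivity, epsilon) for x in stream]

if __name__ == "__main__":
    true_counts = [120, 123, 125, 124, 130]  # hypothetical temporally correlated stream
    print(continuous_release(true_counts))
```

For intuition on why correlations matter, consider the extreme case of perfect correlation, i.e., $x_t = x_1$ for all $t$: by sequential composition, the first $t$ releases jointly form a $t\varepsilon$-DP view of $x_1$, so an adversary who knows the correlation faces an event-level privacy loss at time 1 that grows linearly in $t$. This toy case only suggests how leakage can accumulate; the paper analyzes general temporal correlations.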