Fine-grained Poisoning Attacks to Local Differential Privacy Protocols for Mean and Variance Estimation

05/24/2022
by Xiaoguang Li, et al.

Local differential privacy (LDP) protects individual data contributors from privacy-probing data aggregation and analytics. Recent work has shown that LDP protocols for certain data types are vulnerable to data poisoning attacks, which allow an attacker to alter analytical results by injecting carefully crafted bogus data. In this work, we focus on applying data poisoning attacks to the previously unexplored statistical tasks of mean and variance estimation. In contrast to prior work that aims for overall LDP performance degradation or straightforward maximization of attack gain, our attacker can fine-tune the LDP-estimated mean and variance to desired target values and manipulate both simultaneously. To accomplish this goal, we propose two types of data poisoning attacks: the input poisoning attack (IPA) and the output poisoning attack (OPA). The former is independent of LDP, while the latter exploits the characteristics of LDP and is therefore more effective. More intriguingly, we observe a security-privacy consistency in which a small ϵ enhances the security of LDP, contrary to the previously reported security-privacy trade-off. We further study this consistency and reveal a more holistic view of the threat landscape of LDP in the presence of data poisoning attacks. We comprehensively evaluate the attacks on three real-world datasets and report their effectiveness in achieving the target values. We also explore defense mechanisms and provide insights into secure LDP design.

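To make the attack surface concrete, the minimal Python sketch below illustrates an input poisoning attack against a simple Laplace-based LDP mean estimator. It is an illustration only and does not reproduce the paper's IPA or OPA constructions: the choice of mechanism, the helper names (ldp_report, estimate_mean), and all numerical settings are assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

def ldp_report(x, eps, lo=0.0, hi=1.0):
    # Each user locally perturbs a value in [lo, hi] with Laplace noise
    # calibrated to the range width (a common LDP mean-estimation baseline).
    return x + rng.laplace(scale=(hi - lo) / eps)

def estimate_mean(reports):
    # The untrusted server simply averages the noisy reports.
    return float(np.mean(reports))

# Genuine population: n users with values concentrated around 0.3.
n, eps = 10_000, 1.0
genuine = rng.uniform(0.2, 0.4, size=n)
genuine_reports = [ldp_report(x, eps) for x in genuine]

# Input poisoning (illustrative): the attacker controls m fake users and
# chooses their *inputs* so the aggregate mean is steered toward a target.
# Solve (n * genuine_mean + m * fake_value) / (n + m) = target for
# fake_value, assuming the attacker can roughly estimate the genuine mean.
m, target, est_genuine_mean = 1_000, 0.35, 0.3
fake_value = np.clip((target * (n + m) - n * est_genuine_mean) / m, 0.0, 1.0)
fake_reports = [ldp_report(fake_value, eps) for _ in range(m)]

print("mean without attack:", estimate_mean(genuine_reports))
print("mean with attack:   ", estimate_mean(genuine_reports + fake_reports))

In an output poisoning attack, by contrast, the fake users would submit crafted values directly as if they were already-perturbed LDP outputs, sidestepping the local noise; as the abstract notes, exploiting the LDP mechanism in this way makes OPA the more effective of the two.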