Poisoning Attacks to Local Differential Privacy Protocols for Key-Value Data

11/22/2021
by Yongji Wu, et al.

Local Differential Privacy (LDP) protocols enable an untrusted server to perform privacy-preserving, federated data analytics. Various LDP protocols have been developed for different types of data, such as categorical data, numerical data, and key-value data. Due to their distributed settings, LDP protocols are fundamentally vulnerable to poisoning attacks, in which fake users manipulate the server's analytics results by sending carefully crafted data to the server. However, existing poisoning attacks have focused on LDP protocols for simple data types such as categorical and numerical data, leaving the security of LDP protocols for more advanced data types such as key-value data unexplored. In this work, we aim to bridge this gap by introducing novel poisoning attacks to LDP protocols for key-value data. In such an LDP protocol, a server aims to simultaneously estimate the frequency and mean value of each key among a set of users, each of whom possesses a set of key-value pairs. Our poisoning attacks aim to simultaneously maximize the frequencies and mean values of some attacker-chosen target keys by sending carefully crafted data from some fake users to the server. Specifically, since our attacks have two objectives, we formulate them as a two-objective optimization problem. Moreover, we propose a method to approximately solve the two-objective optimization problem, from which we obtain the optimal crafted data the fake users should send to the server. We demonstrate the effectiveness of our attacks against three LDP protocols for key-value data both theoretically and empirically. We also explore two defenses against our attacks, which are effective in some scenarios but have limited effectiveness in others. Our results highlight the need for new defenses against our poisoning attacks.
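To make the attack setting concrete, the following is a minimal sketch of how fake users can bias a server's per-key frequency and mean estimates. It is not the paper's protocol: the LDP perturbation step and the optimized crafting of fake reports are omitted, all names are hypothetical, and the server simply aggregates raw (key, value) reports. The bias mechanism it illustrates, though, is the same: fake users who all report a target key with an extreme value inflate both its estimated frequency and its estimated mean.

```python
# Hypothetical sketch (not the paper's protocol): how fake users bias a
# server's frequency and mean estimates for an attacker-chosen target key.
# LDP perturbation is omitted for simplicity.

def estimate(reports, key, n_total):
    """Estimate the frequency and mean value of `key` from (key, value) reports."""
    vals = [v for k, v in reports if k == key]
    freq = len(vals) / n_total
    mean = sum(vals) / len(vals) if vals else 0.0
    return freq, mean

# 100 genuine users: 20 hold the target key with value 0.2.
genuine = [("target", 0.2)] * 20 + [("other", 0.5)] * 80

# 20 fake users all report the target key with the maximum value 1.0.
fake = [("target", 1.0)] * 20
n = len(genuine) + len(fake)

f0, m0 = estimate(genuine, "target", len(genuine))  # before the attack
f1, m1 = estimate(genuine + fake, "target", n)      # after the attack

# Frequency rises from 0.20 to ~0.33 and the mean from 0.20 to 0.60.
print(f"before: freq={f0:.2f}, mean={m0:.2f}")
print(f"after:  freq={f1:.2f}, mean={m1:.2f}")
```

In the actual attacks, the fake reports must pass through the LDP protocol's encoding, so the optimal crafted data is obtained by solving the two-objective optimization problem described above rather than by naively reporting extreme values.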
