MURS: Practical and Robust Privacy Amplification with Multi-Party Differential Privacy

08/30/2019 ∙ by Tianhao Wang, et al.

When collecting information, local differential privacy (LDP) alleviates users' privacy concerns because each user's private information is randomized before it is sent to the central aggregator. However, LDP incurs a loss of utility because of the noise added to each individual data item. To address this issue, recent work introduced an intermediate server, under the assumption that this server does not collude with the aggregator. Under this trust model, less noise suffices to achieve the same privacy guarantee, thus improving utility. In this paper, we investigate this multi-party setting of LDP. We first analyze the threat model and identify potential adversaries. We then make observations about existing approaches and propose new techniques that achieve a better privacy-utility tradeoff than existing ones. Finally, we perform experiments to compare the different methods and demonstrate the benefits of our proposed method.
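To make the local randomization step concrete, here is a minimal sketch of one standard LDP mechanism, generalized randomized response. This is an illustration of the general LDP setting described above, not the paper's proposed method; the function name and parameters are chosen for this example.

```python
import math
import random

def generalized_randomized_response(value, domain, epsilon):
    """Perturb `value` (an element of `domain`) to satisfy epsilon-LDP.

    Report the true value with probability p = e^eps / (e^eps + k - 1),
    where k = |domain|; otherwise report one of the other k - 1 values
    uniformly at random. (Illustrative helper, not from the paper.)
    """
    k = len(domain)
    p = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p:
        return value
    # Lie: report a uniformly random value other than the true one.
    others = [v for v in domain if v != value]
    return random.choice(others)

# Each user runs this locally before sending anything to the aggregator,
# so the aggregator only ever sees the randomized report.
domain = list(range(10))
report = generalized_randomized_response(3, domain, epsilon=1.0)
```

The aggregator can later debias the collected reports to estimate the true distribution; the utility loss the abstract refers to comes from the variance this randomization introduces, which the intermediate-server trust model reduces.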


