The Limits of Differential Privacy (and its Misuse in Data Release and Machine Learning)

11/04/2020
by Josep Domingo-Ferrer et al.

Differential privacy (DP) is a neat privacy definition that can co-exist with certain well-defined data uses in the context of interactive queries. However, DP is neither a silver bullet for all privacy problems nor a replacement for all previous privacy models. In fact, extreme care should be exercised when trying to extend its use beyond the setting it was designed for. This paper reviews the limitations of DP and its misuse for individual data collection, individual data release, and machine learning.
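To make the interactive-query setting concrete, the sketch below shows the standard Laplace mechanism answering a single counting query under ε-DP. This is a generic illustration, not code from the paper; the toy dataset, the query, and the choice of epsilon are illustrative assumptions.

```python
# Minimal sketch of the interactive-query setting DP was designed for:
# an analyst asks a counting query, and the curator returns the true count
# perturbed with Laplace noise scaled to sensitivity/epsilon.
# Dataset, query, and epsilon below are illustrative assumptions.

import numpy as np

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Return an epsilon-DP answer: true value plus Laplace(0, sensitivity/epsilon) noise."""
    scale = sensitivity / epsilon
    return true_value + np.random.laplace(loc=0.0, scale=scale)

# Toy dataset: ages of individuals (assumed for illustration).
ages = np.array([23, 45, 31, 62, 29, 51, 38, 47])

# Counting query: how many individuals are over 40?
# A count has sensitivity 1: adding or removing one person changes it by at most 1.
true_count = int(np.sum(ages > 40))

epsilon = 0.5  # privacy budget spent on this single query (assumed)
noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=epsilon)

print(f"True count: {true_count}, DP answer (eps={epsilon}): {noisy_count:.2f}")
```

Each further query in an interactive session would consume additional budget, which is precisely why extending DP beyond a bounded set of well-defined queries is delicate.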


