
Information-theoretic metrics for Local Differential Privacy protocols

Local Differential Privacy (LDP) protocols allow an aggregator to obtain population statistics about the sensitive data of a user base while protecting the privacy of the individual users. To understand the tradeoff between aggregator utility and user privacy, we introduce new information-theoretic metrics for utility and privacy. Unlike existing LDP metrics, these metrics highlight the fact that the users and the aggregator are interested in fundamentally different domains of information. We show how our metrics relate to ε-LDP, the de facto standard privacy metric, giving an information-theoretic interpretation to the latter. Furthermore, we use our metrics to quantitatively study the privacy-utility tradeoff for a number of popular protocols.
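To make the setting concrete, a minimal sketch of one popular LDP protocol, binary randomized response, is shown below. This example is not taken from the paper; it is the standard textbook mechanism, and the function names and parameters here are illustrative. Each user perturbs their own bit locally before sending it, which yields the ε-LDP guarantee, and the aggregator inverts the known noise distribution to recover an unbiased population frequency.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability p = e^eps / (e^eps + 1),
    otherwise flip it. This mechanism satisfies eps-LDP because the
    likelihood ratio of any output under the two inputs is at most e^eps."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p else 1 - bit

def estimate_frequency(reports, epsilon: float) -> float:
    """Aggregator side: unbiased estimate of the true fraction of 1s.
    Inverts E[observed] = p*f + (1 - p)*(1 - f) for the true frequency f."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    true_bits = [1 if random.random() < 0.3 else 0 for _ in range(10_000)]
    reports = [randomized_response(b, epsilon=1.0) for b in true_bits]
    print(estimate_frequency(reports, epsilon=1.0))  # close to 0.3
```

The privacy-utility tradeoff the abstract describes is visible directly in the parameter ε: a smaller ε flips bits more often (more privacy for each user), but inflates the variance of the aggregator's frequency estimate (less utility).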



The Privacy Funnel from the viewpoint of Local Differential Privacy

We consider a database X⃗ = (X_1,...,X_n) containing the data of n users...

Information-Theoretic Approaches to Differential Privacy

The tutorial studies relationships between differential privacy and vari...

On the Relationship Between Information-Theoretic Privacy Metrics And Probabilistic Information Privacy

Information-theoretic (IT) measures based on f-divergences have recently...

Context-Aware Local Differential Privacy

Local differential privacy (LDP) is a strong notion of privacy for indiv...

Estimating Numerical Distributions under Local Differential Privacy

When collecting information, local differential privacy (LDP) relieves t...

Privacy-Utility Trades in Crowdsourced Signal Map Obfuscation

Cellular providers and data aggregating companies crowdsource cellular si...

Privacy-Preserving Distributed Processing: Metrics, Bounds, and Algorithms

Privacy-preserving distributed processing has recently attracted conside...