
Information-theoretic metrics for Local Differential Privacy protocols

Local Differential Privacy (LDP) protocols allow an aggregator to obtain population statistics about the sensitive data of a user base while protecting the privacy of the individual users. To understand the tradeoff between aggregator utility and user privacy, we introduce new information-theoretic metrics for utility and privacy. Unlike other LDP metrics, these metrics highlight that the users and the aggregator are interested in fundamentally different domains of information. We show how our metrics relate to ε-LDP, the de facto standard privacy metric, giving an information-theoretic interpretation to the latter. Furthermore, we use our metrics to quantitatively study the privacy-utility tradeoff for a number of popular protocols.
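The full text is not reproduced here, but as a concrete point of reference for the ε-LDP guarantee and the privacy-utility tradeoff discussed above, the sketch below implements binary randomized response, a canonical ε-LDP protocol of the kind the paper analyzes. The function names and the debiasing estimator are illustrative choices, not taken from the paper.

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report the true bit with probability e^eps / (e^eps + 1), otherwise flip it.
    This classic binary mechanism satisfies epsilon-LDP."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_frequency(reports: list[int], epsilon: float) -> float:
    """Debias the observed frequency of 1s to estimate the true population frequency."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # E[observed] = f*p + (1 - f)*(1 - p)  =>  f = (observed - (1 - p)) / (2p - 1)
    return (observed - (1.0 - p_truth)) / (2.0 * p_truth - 1.0)

if __name__ == "__main__":
    epsilon = 1.0
    true_bits = [1 if random.random() < 0.3 else 0 for _ in range(100_000)]
    reports = [randomized_response(b, epsilon) for b in true_bits]
    print(f"true frequency ~0.300, estimate: {estimate_frequency(reports, epsilon):.3f}")
```

Reporting the true bit with probability e^ε/(e^ε + 1) bounds the likelihood ratio of any report between the two possible inputs by e^ε, which is exactly the ε-LDP condition; the aggregator then inverts the known noise distribution to recover the population frequency, trading user-level privacy for estimation accuracy.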

