Local Differential Privacy Is Equivalent to Contraction of E_γ-Divergence

by Shahab Asoodeh, et al.

We investigate the local differential privacy (LDP) guarantees of a randomized privacy mechanism via its contraction properties. We first show that LDP constraints can be equivalently cast in terms of the contraction coefficient of the E_γ-divergence. We then use this equivalence to express LDP guarantees of privacy mechanisms in terms of contraction coefficients of arbitrary f-divergences. When combined with standard estimation-theoretic tools (such as Le Cam's and Fano's converse methods), this result allows us to study the trade-off between privacy and utility in several hypothesis testing, minimax estimation, and Bayesian estimation problems.
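The equivalence in the abstract can be made concrete on finite alphabets: a mechanism K satisfies ε-LDP exactly when the E_γ-divergence (with γ = e^ε) between any two of its output distributions vanishes, where E_γ(P‖Q) = Σ_y max(P(y) − γQ(y), 0). The following sketch (my own illustration, not code from the paper; the function names and the randomized-response construction are assumptions) checks this condition for the classical k-ary randomized response mechanism:

```python
import math

def e_gamma(p, q, gamma):
    """E_gamma-divergence (hockey-stick divergence) between two
    finite distributions given as probability vectors:
    E_gamma(P || Q) = sum_y max(P(y) - gamma * Q(y), 0)."""
    return sum(max(pi - gamma * qi, 0.0) for pi, qi in zip(p, q))

def randomized_response(k, eps):
    """k-ary randomized response: report the true symbol with
    probability e^eps / (e^eps + k - 1), else any other symbol
    uniformly.  Returns the k rows K(.|x) of the mechanism."""
    denom = math.exp(eps) + k - 1
    return [[math.exp(eps) / denom if y == x else 1.0 / denom
             for y in range(k)] for x in range(k)]

def is_eps_ldp(rows, eps, tol=1e-12):
    """eps-LDP holds iff E_{e^eps}(K(.|x) || K(.|x')) = 0
    for every pair of inputs x, x'."""
    gamma = math.exp(eps)
    return all(e_gamma(p, q, gamma) <= tol for p in rows for q in rows)

eps = math.log(3)
rows = randomized_response(2, eps)   # rows: [[0.75, 0.25], [0.25, 0.75]]
print(is_eps_ldp(rows, eps))         # True: exactly eps-LDP
print(is_eps_ldp(rows, eps - 0.5))   # False: not (eps - 0.5)-LDP
```

Randomized response is tight here: the E_γ test passes at γ = e^ε and fails for any smaller γ, matching the fact that ε is the mechanism's exact LDP parameter.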




