Contraction of Locally Differentially Private Mechanisms

10/24/2022
by Shahab Asoodeh, et al.

We investigate the contraction properties of locally differentially private mechanisms. More specifically, we derive tight upper bounds on the divergence between the output distributions PK and QK of an ϵ-LDP mechanism K in terms of a divergence between the corresponding input distributions P and Q. Our first main technical result presents a sharp upper bound on the χ²-divergence χ²(PK‖QK) in terms of χ²(P‖Q) and ϵ. We also show that the same result holds for a large family of divergences, including KL-divergence and squared Hellinger distance. The second main technical result gives an upper bound on χ²(PK‖QK) in terms of the total variation distance TV(P, Q) and ϵ. We then utilize these bounds to establish locally private versions of the Cramér-Rao bound, Le Cam's method, Assouad's method, and the mutual information method, which are powerful tools for bounding minimax estimation risks. These results are shown to yield better privacy analyses than the state of the art in several statistical problems, such as entropy and discrete distribution estimation, non-parametric density estimation, and hypothesis testing.
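As an illustrative sketch (not from the paper itself), the contraction phenomenon can be checked numerically for binary randomized response, a canonical ϵ-LDP mechanism. The contraction factor tanh(ϵ/2)² = ((e^ϵ−1)/(e^ϵ+1))² used below is an assumption derived directly for this binary case; the paper's general bounds are more refined.

```python
import math

def chi2(p, q):
    # chi-squared divergence between two distributions given as lists
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q))

def randomized_response(dist, eps):
    # Push a binary distribution through eps-LDP randomized response:
    # each sample is reported truthfully with probability e^eps/(1+e^eps).
    a = math.exp(eps) / (1 + math.exp(eps))
    out1 = a * dist[1] + (1 - a) * dist[0]
    return [1 - out1, out1]

eps = 1.0
P, Q = [0.7, 0.3], [0.4, 0.6]
PK, QK = randomized_response(P, eps), randomized_response(Q, eps)

ratio = chi2(PK, QK) / chi2(P, Q)
bound = math.tanh(eps / 2) ** 2  # ((e^eps - 1)/(e^eps + 1))^2
print(ratio <= bound)  # contraction holds for this binary example
```

The ratio χ²(PK‖QK)/χ²(P‖Q) is strictly below 1 (the data processing inequality) and, for this binary mechanism, below the tanh(ϵ/2)² factor, illustrating how an ϵ-LDP channel contracts divergences between input distributions.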


Related research

- 06/08/2017: Estimating Mixture Entropy with Pairwise Distances
  Mixture distributions arise in many parametric and non-parametric settin...
- 03/01/2022: Tight bounds for augmented KL divergence in terms of augmented total variation distance
  We provide optimal variational upper and lower bounds for the augmented ...
- 02/02/2021: Local Differential Privacy Is Equivalent to Contraction of E_γ-Divergence
  We investigate the local differential privacy (LDP) guarantees of a rand...
- 10/27/2020: Faster Differentially Private Samplers via Rényi Divergence Analysis of Discretized Langevin MCMC
  Various differentially private algorithms instantiate the exponential me...
- 08/07/2018: Test without Trust: Optimal Locally Private Distribution Testing
  We study the problem of distribution testing when the samples can only b...
- 11/07/2021: Sampling from Log-Concave Distributions with Infinity-Distance Guarantees and Applications to Differentially Private Optimization
  For a d-dimensional log-concave distribution π(θ)∝ e^-f(θ) on a polytope...
- 11/29/2018: Locally Differentially-Private Randomized Response for Discrete Distribution Learning
  We consider a setup in which confidential i.i.d. samples X_1,…,X_n from a...
