Efficiency in local differential privacy

01/25/2023
by Lukas Steinberger, et al.

We develop a theory of asymptotic efficiency in regular parametric models when data confidentiality is ensured by local differential privacy (LDP). Although efficient parameter estimation is a classical and well-studied problem in mathematical statistics, the LDP setting raises several non-trivial obstacles that need to be overcome. Starting from a standard parametric model 𝒫=(P_θ)_θ∈Θ, Θ⊆ℝ^p, for the iid unobserved sensitive data X_1,…, X_n, we establish local asymptotic mixed normality (along subsequences) of the model Q^(n)𝒫=(Q^(n)P_θ^n)_θ∈Θ generating the sanitized observations Z_1,…, Z_n, where Q^(n) is an arbitrary sequence of sequentially interactive privacy mechanisms. This result readily implies convolution and local asymptotic minimax theorems. In the case p=1, the optimal asymptotic variance is found to be the inverse of the supremal Fisher information sup_Q∈𝒬_α I_θ(Q𝒫)∈ℝ, where the supremum runs over all α-differentially private (marginal) Markov kernels. We present an algorithm for finding a (nearly) optimal privacy mechanism Q̂ and an estimator θ̂_n(Z_1,…, Z_n), based on the corresponding sanitized data, that achieves this asymptotically optimal variance.
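To make the objects in the abstract concrete, here is a minimal sketch that is not taken from the paper: it assumes a Bernoulli(θ) model for the sensitive data and a non-interactive randomized-response mechanism as the privacy channel Q. It computes the Fisher information I_θ(Q𝒫) of the privatized marginal model and checks by simulation that a simple moment estimator built from the sanitized data attains the variance 1/(n·I_θ(Q𝒫)). The model, mechanism, and estimator are illustrative assumptions, not the paper's (nearly) optimal construction.

```python
import numpy as np

# Toy illustration (assumptions, not the paper's algorithm):
# sensitive data X_i ~ Bernoulli(theta), sanitized by alpha-DP randomized
# response; we compare the estimator's variance with 1/(n * I_theta(Q P)).

rng = np.random.default_rng(0)

alpha = 1.0     # privacy level of the LDP constraint
theta = 0.3     # true (unobserved) parameter
n = 100_000     # sample size per replication

# Randomized response keeps X with probability p and flips it otherwise;
# this marginal Markov kernel is alpha-differentially private.
p = np.exp(alpha) / (1.0 + np.exp(alpha))

def sanitize(x, rng):
    """Apply alpha-DP randomized response to a 0/1 array x."""
    flip = rng.random(x.shape) >= p
    return np.where(flip, 1 - x, x)

def q(theta):
    """P(Z = 1) under the privatized model Q P_theta."""
    return (1.0 - p) + (2.0 * p - 1.0) * theta

def fisher_info(theta):
    """Fisher information I_theta(Q P) of the privatized Bernoulli model."""
    qt = q(theta)
    return (2.0 * p - 1.0) ** 2 / (qt * (1.0 - qt))

def estimate(z):
    """Unbiased moment estimator of theta: invert the linear map theta -> q(theta)."""
    return (z.mean() - (1.0 - p)) / (2.0 * p - 1.0)

# Monte Carlo check: empirical variance vs. the bound 1 / (n * I_theta).
reps = 200
est = np.empty(reps)
for r in range(reps):
    x = (rng.random(n) < theta).astype(int)  # sensitive data, never released
    z = sanitize(x, rng)                     # only the sanitized Z_i are observed
    est[r] = estimate(z)

print("empirical variance of estimator:", est.var())
print("1 / (n * I_theta(Q P))        :", 1.0 / (n * fisher_info(theta)))
```

In this binary toy case the two printed numbers agree up to Monte Carlo error, illustrating what it means for an estimator based only on sanitized observations to attain the inverse Fisher information of the privatized model; the paper's contribution is to characterize and (nearly) attain the supremum of this quantity over all α-differentially private kernels in general regular parametric models.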
