Differentially Private Multivariate Statistics with an Application to Contingency Table Analysis

by Jonghyeok Lee et al.

Differential privacy (DP) has become a central and rigorous concept in privacy protection over the past decade. Among the various notions of DP, f-DP is an easily interpretable and informative notion that tightly captures the privacy level through trade-off functions arising from the hypothesis-testing problem of how well a mechanism's output reveals individual information in the dataset. We adopt Gaussian differential privacy (GDP), a canonical parametric family within f-DP. The Gaussian mechanism is a natural and fundamental mechanism that tightly achieves GDP. However, the ordinary multivariate Gaussian mechanism is not optimal with respect to statistical utility. To improve utility, we develop the rank-deficient and James-Stein Gaussian mechanisms for releasing private multivariate statistics, based on the geometry of the multivariate Gaussian distribution. We show that our proposals satisfy GDP and dominate the ordinary Gaussian mechanism with respect to the L_2-cost. We also show that the Laplace mechanism, the prime mechanism in the ε-DP framework, is suboptimal compared with Gaussian-type mechanisms under the GDP framework. For a fair comparison, we calibrate the Laplace mechanism to the global sensitivity of the statistic using the exact trade-off function, and we derive the optimal parameter for the Laplace mechanism when applied to contingency tables. Indeed, we show that the Gaussian-type mechanisms dominate the Laplace mechanism in contingency table analysis. In addition, we apply our findings to propose differentially private χ^2-tests on contingency tables. Numerical results demonstrate that the differentially private parametric bootstrap tests control the type I error rates and exhibit higher power than other natural competitors.
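As background for the abstract above, the ordinary (isotropic) Gaussian mechanism it refers to can be sketched as follows: adding N(0, (Δ₂/μ)² I) noise to a statistic with L₂-sensitivity Δ₂ satisfies μ-GDP. This is a minimal illustrative sketch, not the paper's proposed rank-deficient or James-Stein mechanism; the function name and the toy contingency table are assumptions of this example.

```python
import numpy as np

def gaussian_mechanism(stat, l2_sensitivity, mu, rng=None):
    """Release a NumPy array `stat` under mu-GDP by adding
    i.i.d. Gaussian noise with scale sigma = l2_sensitivity / mu.
    (Illustrative sketch; not the paper's improved mechanisms.)"""
    rng = np.random.default_rng() if rng is None else rng
    sigma = l2_sensitivity / mu
    return stat + rng.normal(scale=sigma, size=stat.shape)

# Example: privatize the cell counts of a 2x2 contingency table.
# Adding or removing one record changes one cell count by 1, so the
# L2-sensitivity is 1 under unbounded neighboring datasets
# (an assumption made for this sketch).
table = np.array([[20.0, 15.0], [10.0, 5.0]])
private_table = gaussian_mechanism(table, l2_sensitivity=1.0, mu=1.0)
```

A smaller μ means stronger privacy and therefore a larger noise scale; the paper's contribution is to reshape this noise (rank-deficient and James-Stein variants) so that the same GDP guarantee is met with lower L₂-cost.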


DPpack: An R Package for Differentially Private Statistical Analysis and Machine Learning

Differential privacy (DP) is the state-of-the-art framework for guarante...

Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms

A common way to protect privacy of sensitive information is to introduce...

A Statistical Threshold for Adversarial Classification in Laplace Mechanisms

This paper studies the statistical characterization of detecting an adve...

Elliptical Perturbations for Differential Privacy

We study elliptical distributions in locally convex vector spaces, and d...

Differential Privacy for Binary Functions via Randomized Graph Colorings

We present a framework for designing differentially private (DP) mechani...

Differentially Private Uniformly Most Powerful Tests for Binomial Data

We derive uniformly most powerful (UMP) tests for simple and one-sided h...

Are We There Yet? Timing and Floating-Point Attacks on Differential Privacy Systems

Differential privacy is a de facto privacy framework that has seen adopt...
