Structure and Sensitivity in Differential Privacy: Comparing K-Norm Mechanisms

01/28/2018
by Jordan Awan, et al.

A common way to protect the privacy of sensitive information is to introduce additional randomness, beyond sampling. Differential Privacy (DP) provides a rigorous framework for quantifying the privacy risk of such procedures, which allow for the release of data summaries such as a statistic T. However, in both theory and practice, the structure of the statistic T is often not carefully analyzed, resulting in inefficient implementations of DP mechanisms that introduce excessive randomness to minimize the risk, reducing the statistical utility of the result. We introduce the adjacent output space S_T of T and connect S_T to the notion of sensitivity, which controls the amount of randomness required to protect privacy. Using S_T, we formalize the comparison of K-Norm Mechanisms and derive the optimal one as a function of the adjacent output space. We use these methods to extend the Objective Perturbation and Functional mechanisms to arbitrary K-Norm Mechanisms, and apply them to Logistic and Linear Regression, respectively, to allow for differentially private releases of statistical results. We compare the performance through simulations and on housing price data. Our results demonstrate that the choice of mechanism impacts the utility of the output, and that our proposed methodology offers a significant improvement in utility for the same level of risk.
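To make the role of sensitivity concrete, the sketch below illustrates the simplest member of the K-Norm family, the Laplace (L1-norm) mechanism: noise is drawn with scale proportional to the statistic's sensitivity divided by the privacy parameter epsilon. This is a generic illustration under standard DP assumptions, not the paper's adjacent-output-space construction; the function and variable names (laplace_mechanism, l1_sensitivity) are purely illustrative.

```python
import numpy as np

def laplace_mechanism(statistic_value, l1_sensitivity, epsilon, rng=None):
    """Release a statistic with epsilon-DP by adding Laplace noise.

    The noise scale is l1_sensitivity / epsilon: the more a single record
    can change the statistic (its L1 sensitivity), the more noise is
    required for the same privacy level.
    """
    rng = np.random.default_rng() if rng is None else rng
    value = np.atleast_1d(np.asarray(statistic_value, dtype=float))
    scale = l1_sensitivity / epsilon
    noise = rng.laplace(loc=0.0, scale=scale, size=value.shape)
    return value + noise

# Illustrative usage: privately release the mean of values known to lie in [0, 1].
data = np.random.default_rng(0).uniform(0, 1, size=100)
sensitivity = 1.0 / len(data)   # one record can shift the mean by at most 1/n
private_mean = laplace_mechanism(data.mean(), sensitivity, epsilon=1.0)
```

Other K-Norm Mechanisms follow the same pattern but draw noise whose density decays with a different norm (e.g., L2 or L-infinity) of the output perturbation; the paper's contribution is to compare these choices through the structure of the adjacent output space S_T.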
