Utility Preserving Secure Private Data Release

01/28/2019
by Jasjeet Dhaliwal, et al.

Differential privacy mechanisms that also make reconstruction of the data impossible come at a cost: a decrease in utility. In this paper, we tackle this problem by designing a private data release mechanism that makes reconstruction of the original data impossible while preserving utility for a wide range of machine learning algorithms. We do so by combining the Johnson-Lindenstrauss (JL) transform with noise drawn from a Laplace distribution. Although the JL transform can itself provide privacy guarantees (Blocki et al., 2012) and make reconstruction impossible, we do not rely on its differential privacy properties and only utilize its ability to make reconstruction impossible. We present novel proofs showing that our mechanism is differentially private under both single-element and single-row changes to any database. To demonstrate utility, we prove that our mechanism maintains pairwise distances between points in expectation, and we show that its variance is proportional to the dimensionality of the subspace into which we project the data. Finally, we experimentally demonstrate the utility of our mechanism by deploying it on the task of clustering.
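
To illustrate the general shape of such a mechanism, the following is a minimal sketch of a JL-projection-plus-Laplace-noise release, not the paper's exact construction: the target dimension k and the noise scale are free illustrative parameters here, whereas the paper would calibrate them to its stated privacy and utility guarantees.

```python
import numpy as np

def jl_laplace_release(X, k, laplace_scale, seed=None):
    """Project the data into a k-dimensional subspace with a random
    Gaussian JL transform, then add Laplace noise to each projected
    coordinate before release.

    X: (n, d) data matrix
    k: target dimension of the projection
    laplace_scale: Laplace scale parameter b (illustrative; the paper
        would derive it from the desired privacy level)
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Gaussian JL matrix scaled so pairwise distances are preserved
    # in expectation after projection.
    P = rng.normal(loc=0.0, scale=1.0 / np.sqrt(k), size=(d, k))
    Y = X @ P
    # Laplace noise added to every projected coordinate.
    noise = rng.laplace(loc=0.0, scale=laplace_scale, size=Y.shape)
    return Y + noise

# Example: release a noisy 10-dimensional sketch of 100 points in R^50.
X = np.random.default_rng(0).normal(size=(100, 50))
Y_private = jl_laplace_release(X, k=10, laplace_scale=0.5, seed=1)
```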
