Improved Differentially Private Euclidean Distance Approximation

03/22/2022
by Nina Mesing Stausholm, et al.

This work shows how to privately and more accurately estimate the Euclidean distance between pairs of vectors. Input vectors x and y are mapped to differentially private sketches x' and y', from which one can estimate the distance between x and y. Our estimator relies on the Sparser Johnson-Lindenstrauss constructions of Kane & Nelson (Journal of the ACM 2014), which for any 0<α,β<1/2 have optimal output dimension k=Θ(α^-2 log(1/β)) and sparsity s=O(α^-1 log(1/β)). We combine the constructions of Kane & Nelson with either the Laplace or the Gaussian mechanism from the differential privacy literature, depending on the privacy parameters ε and δ. We also suggest a differentially private version of the Fast Johnson-Lindenstrauss Transform (FJLT) of Ailon & Chazelle (SIAM Journal on Computing 2009), which trades increased variance for faster sketching in certain parameter regimes. We answer an open question of Kenthapadi et al. (Journal of Privacy and Confidentiality 2013) by analyzing the privacy and utility guarantees of an estimator for Euclidean distance that relies on Laplace rather than Gaussian noise. We prove that the Laplace mechanism yields lower variance than the Gaussian mechanism whenever δ<β^O(1/α). Our work thus improves on that of Kenthapadi et al. by giving a more efficient estimator with lower variance for sufficiently small δ. As a further benefit of the Laplace mechanism, our sketch achieves pure differential privacy, rather than the approximate differential privacy guarantee of the Gaussian mechanism, which may not be sufficiently strong for some settings.
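To make the sketch-then-estimate pipeline concrete, here is a minimal illustrative sketch in Python. It is not the paper's construction: it uses a plain dense Gaussian projection instead of the Sparser Johnson-Lindenstrauss transforms of Kane & Nelson, a hypothetical fixed sensitivity value in place of a proper sensitivity analysis, and a naive distance estimator without the variance correction a real estimator would apply. All function names and parameter values are assumptions for illustration only.

```python
import numpy as np

def make_projection(d, k, rng):
    """Dense random JL projection, scaled by 1/sqrt(k) so that
    Euclidean distances are preserved in expectation.
    (The paper instead uses sparse JL matrices for speed.)"""
    return rng.standard_normal((k, d)) / np.sqrt(k)

def private_sketch(x, A, epsilon, sensitivity, rng):
    """Project x and add Laplace noise with scale sensitivity/epsilon.
    `sensitivity` is a stand-in for the l1-sensitivity of the
    projection on neighboring inputs (illustrative, not derived)."""
    z = A @ x
    noise = rng.laplace(scale=sensitivity / epsilon, size=z.shape)
    return z + noise

def estimate_distance(sx, sy):
    """Naive estimate of ||x - y|| from two private sketches.
    A real estimator would also subtract the expected noise
    contribution to reduce bias."""
    return np.linalg.norm(sx - sy)

# Demo: sketch two random vectors and compare the private estimate
# with the true Euclidean distance.
rng = np.random.default_rng(42)
d, k = 1000, 256
x = rng.standard_normal(d)
y = rng.standard_normal(d)
A = make_projection(d, k, rng)
eps = 5.0    # privacy budget (illustrative)
sens = 0.01  # assumed sensitivity (illustrative)
sx = private_sketch(x, A, eps, sens, rng)
sy = private_sketch(y, A, eps, sens, rng)
true_dist = np.linalg.norm(x - y)
est_dist = estimate_distance(sx, sy)
```

With these parameters the JL error dominates the (small) Laplace noise, so the estimate is typically within a few percent of the true distance; shrinking ε or growing the sensitivity increases the noise scale and hence the variance, which is the tradeoff the abstract analyzes.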


