Distances Release with Differential Privacy in Tree and Grid Graph

04/26/2022
by   Chenglin Fan, et al.

Data about individuals may contain private and sensitive information. Differential privacy (DP) was proposed to protect the privacy of each individual while preserving useful information about the population. Sealfon (2016) introduced a private graph model in which the graph topology is assumed to be public while the edge weights are assumed to be private; this model can express, for example, hidden congestion patterns in a known transportation network. In this paper, we revisit the problem of privately releasing approximate distances between all pairs of vertices (Sealfon 2016). Our goal is to minimize the additive error, namely the difference between the released distance and the actual distance under the private setting. We propose improved solutions to this problem for several cases. For privately releasing all-pairs distances in a tree of depth h, we achieve additive error O(log^1.5 h · log^1.5 V) for fixed privacy parameters, where V is the number of vertices. This improves the previous bound of O(log^2.5 V), since h can be as small as O(log V); in that case a log V factor is saved, and the additive error on a tree can be smaller than the error on an array/path. Additionally, for grid graphs with arbitrary edge weights, we propose a method that releases all-pairs distances with additive error Õ(V^3/4) for fixed privacy parameters. On the application side, many cities such as Manhattan are composed of horizontal streets and vertical avenues, which can be modeled as a grid graph.
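To make the private-weights model concrete, here is a minimal illustrative sketch (not the paper's algorithm) of the naive baseline for a path graph: perturb each private edge weight with Laplace noise calibrated to the privacy parameter eps, then release all-pairs distances as differences of noisy prefix sums. The function name and interface are hypothetical; the additive error of this baseline grows with the path length, which is exactly what the tree-based techniques discussed in the paper improve to polylogarithmic.

```python
import math
import random

def laplace(scale):
    # Sample from the Laplace(0, scale) distribution via inverse CDF.
    u = random.random() - 0.5
    sign = 1 if u >= 0 else -1
    return -scale * sign * math.log(1 - 2 * abs(u))

def private_path_distances(weights, eps):
    """Naive baseline (illustrative, not the paper's method):
    add Lap(1/eps) noise to each private edge weight of a path,
    then release every pairwise distance as a difference of noisy
    prefix sums. Each weight appears in one prefix sum, so the
    mechanism is eps-DP for neighboring weight vectors that differ
    by 1 in one edge; the error for a pair (i, j) scales with
    sqrt(j - i) rather than polylogarithmically."""
    noisy = [w + laplace(1.0 / eps) for w in weights]
    prefix = [0.0]
    for w in noisy:
        prefix.append(prefix[-1] + w)
    n = len(prefix)  # number of vertices on the path
    return {(i, j): prefix[j] - prefix[i]
            for i in range(n) for j in range(i, n)}
```

Released distances from this baseline telescope exactly (d(0, j) = d(0, i) + d(i, j)), since they all derive from one noisy prefix array; the cost is that the noise in a long distance accumulates, motivating the binary-tree and heavy-path decompositions used for the polylog bounds.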
