Protect Edge Privacy in Path Publishing with Differential Privacy

01/07/2020
by Zhigang Lu, et al.

Paths in a network are a generalised form of time-serial chains found in many real-world applications, such as trajectories and Internet flows. Differentially private trajectory publishing concerns publishing path information that is usable to genuine users yet secure against adversaries who attempt to reconstruct the path with maximum background knowledge. Existing studies all assume this knowledge to be all but one vertex on the path. To prevent adversaries from recovering the missing information, they publish a perturbed path in which each vertex is sampled with differential privacy (DP) from a pre-defined set to replace the corresponding vertex in the original path.

In this paper, we relax this assumption to all but one edge on the path, and hence consider more powerful adversaries whose maximum background knowledge covers the entire network topology and the path (including all the vertices) except one arbitrary missing edge. Under this assumption, the perturbed path produced by the existing work is vulnerable, because the adversary can reconstruct the missing edge from the existence of an edge in the perturbed path. To address this vulnerability and effectively protect edge privacy, instead of publishing a perturbed path, we propose a novel scheme of graph-based path publishing: we protect the original path by embedding it, with the differential privacy technique, in a graph that contains fake edges and replicated vertices, such that only legitimate users who have full knowledge of the network topology are able to recover the exact vertices and edges of the original path with high probability. We theoretically analyse the differential privacy, utility, and execution efficiency of our algorithm. We also conduct extensive experimental evaluations on a high-performance cluster system to validate our analytical results.
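The baseline approach described above, sampling a replacement vertex from a pre-defined candidate set under differential privacy, is commonly realised with the exponential mechanism. The sketch below is a minimal, generic illustration of that mechanism, not the paper's actual algorithm; the candidate set, utility scores, and epsilon value are hypothetical.

```python
import math
import random

def exponential_mechanism(candidates, utility, epsilon, sensitivity=1.0):
    """Sample one candidate with probability proportional to
    exp(epsilon * utility(c) / (2 * sensitivity)) -- the standard
    exponential mechanism for differentially private selection."""
    weights = [math.exp(epsilon * utility(c) / (2.0 * sensitivity))
               for c in candidates]
    total = sum(weights)
    r = random.random() * total
    acc = 0.0
    for c, w in zip(candidates, weights):
        acc += w
        if r <= acc:
            return c
    return candidates[-1]  # guard against floating-point round-off

# Hypothetical example: replace vertex "v2" on a path by privately
# sampling a substitute; utility is the negative hop distance, so
# candidates closer to the true vertex are more likely to be chosen.
candidate_set = {"v2": 0, "v5": -1, "v7": -1, "v9": -2}
choice = exponential_mechanism(list(candidate_set),
                               lambda v: candidate_set[v],
                               epsilon=1.0)
```

Because the output is drawn from the whole candidate set with non-zero probability everywhere, an adversary observing one published vertex cannot determine the original one with certainty; this is the vertex-level guarantee that the paper's edge-level attack circumvents.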


Related research

Towards Training Graph Neural Networks with Node-Level Differential Privacy (10/10/2022)
Graph Neural Networks (GNNs) have achieved great success in mining graph...

Differential Privacy Meets Maximum-weight Matching (11/16/2020)
When it comes to large-scale multi-agent systems with a diverse set of a...

Breaking the Linear Error Barrier in Differentially Private Graph Distance Release (04/29/2022)
Releasing all pairwise shortest path (APSP) distances between vertices o...

Differentially Private k-Means Clustering with Guaranteed Convergence (02/03/2020)
Iterative clustering algorithms help us to learn the insights behind the...

Real-World Trajectory Sharing with Local Differential Privacy (08/04/2021)
Sharing trajectories is beneficial for many real-world applications, suc...

Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy (02/21/2022)
Differentially private GNNs (Graph Neural Networks) have been recently s...

LDPTrace: Locally Differentially Private Trajectory Synthesis (02/13/2023)
Trajectory data has the potential to greatly benefit a wide-range of rea...
