ProGAP: Progressive Graph Neural Networks with Differential Privacy Guarantees

04/18/2023
by Sina Sajadmanesh, et al.

Graph Neural Networks (GNNs) have become a popular tool for learning on graphs, but their widespread use raises privacy concerns as graph data can contain personal or sensitive information. Differentially private GNN models have been recently proposed to preserve privacy while still allowing for effective learning over graph-structured datasets. However, achieving an ideal balance between accuracy and privacy in GNNs remains challenging due to the intrinsic structural connectivity of graphs. In this paper, we propose a new differentially private GNN called ProGAP that uses a progressive training scheme to improve such accuracy-privacy trade-offs. Combined with the aggregation perturbation technique to ensure differential privacy, ProGAP splits a GNN into a sequence of overlapping submodels that are trained progressively, expanding from the first submodel to the complete model. Specifically, each submodel is trained over the privately aggregated node embeddings learned and cached by the previous submodels, leading to an increased expressive power compared to previous approaches while limiting the incurred privacy costs. We formally prove that ProGAP ensures edge-level and node-level privacy guarantees for both training and inference stages, and evaluate its performance on benchmark graph datasets. Experimental results demonstrate that ProGAP can achieve up to 5% higher accuracy than existing state-of-the-art differentially private GNNs.
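To make the training scheme described above more concrete, here is a minimal, illustrative sketch of progressive training with aggregation perturbation, based only on the abstract. The names (`noisy_aggregate`, `MLP`, `progressive_train`), the noise scale `sigma`, and all hyperparameters are assumptions for illustration, not the authors' implementation; in particular, the per-submodel training step omits the gradient perturbation that node-level privacy would additionally require.

```python
# Sketch of progressive training over cached, privately aggregated embeddings.
# All names and hyperparameters are illustrative assumptions, not ProGAP's actual code.
import torch
import torch.nn as nn
import torch.nn.functional as F


def noisy_aggregate(adj: torch.Tensor, x: torch.Tensor, sigma: float) -> torch.Tensor:
    """Aggregate neighbor embeddings and add Gaussian noise (aggregation perturbation)."""
    x = F.normalize(x, p=2, dim=1)               # bound each node's contribution (sensitivity control)
    agg = adj @ x                                 # sum-aggregate neighbor embeddings
    return agg + sigma * torch.randn_like(agg)    # Gaussian noise calibrated to the privacy budget


class MLP(nn.Module):
    """Simple per-node head used by each submodel in this sketch."""
    def __init__(self, d_in: int, d_hidden: int, d_out: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, d_hidden), nn.ReLU(), nn.Linear(d_hidden, d_out))

    def forward(self, x):
        return self.net(x)


def train_submodel(model, inputs, labels, epochs=50, lr=1e-2):
    """Train one submodel on (cached) inputs. For node-level privacy this step
    would itself need a private optimizer such as DP-SGD."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = F.cross_entropy(model(inputs), labels)
        loss.backward()
        opt.step()
    return model


def progressive_train(features, adj, labels, num_stages=3, d_hidden=64, num_classes=7, sigma=1.0):
    """Progressively expand the model: each stage trains a new submodel over the
    privately aggregated embeddings cached from the previous stage."""
    cached = features                             # stage-0 input: raw node features
    heads = []
    for stage in range(num_stages):
        head = MLP(cached.shape[1], d_hidden, num_classes)
        train_submodel(head, cached, labels)
        heads.append(head)
        if stage < num_stages - 1:
            with torch.no_grad():
                emb = head.net[:-1](cached)                          # trained hidden representation
                cached = noisy_aggregate(adj, emb, sigma).detach()   # cache the noisy aggregates once
    return heads
```

In this sketch, noise is injected only once per stage into the cached aggregates: later submodels reuse the already-perturbed embeddings instead of re-querying the graph, which is the mechanism the abstract credits for limiting the incurred privacy costs while growing the model's depth.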

Related research

- Towards Training Graph Neural Networks with Node-Level Differential Privacy (10/10/2022): Graph Neural Networks (GNNs) have achieved great success in mining graph...
- Differentially Private Graph Classification with GNNs (02/05/2022): Graph Neural Networks (GNNs) have established themselves as the state-of...
- Releasing Graph Neural Networks with Differential Privacy Guarantees (09/18/2021): With the increasing popularity of Graph Neural Networks (GNNs) in severa...
- Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy (02/21/2022): Differentially private GNNs (Graph Neural Networks) have been recently s...
- Privacy-Utility Trade-offs in Neural Networks for Medical Population Graphs: Insights from Differential Privacy and Graph Structure (07/13/2023): We initiate an empirical investigation into differentially private graph...
- Training Differentially Private Graph Neural Networks with Random Walk Sampling (01/02/2023): Deep learning models are known to put the privacy of their training data...
- SoK: Differential Privacy on Graph-Structured Data (03/17/2022): In this work, we study the applications of differential privacy (DP) in ...