GAP: Differentially Private Graph Neural Networks with Aggregation Perturbation

03/02/2022
by   Sina Sajadmanesh, et al.

Graph Neural Networks (GNNs) are powerful models designed for graph data that learn node representations by recursively aggregating information from each node's local neighborhood. However, despite their state-of-the-art performance in predictive graph-based applications, recent studies have shown that GNNs can raise significant privacy concerns when graph data contain sensitive information. As a result, in this paper, we study the problem of learning GNNs with Differential Privacy (DP). We propose GAP, a novel differentially private GNN that safeguards the privacy of nodes and edges using aggregation perturbation, i.e., adding calibrated stochastic noise to the output of the GNN's aggregation function, which statistically obfuscates the presence of a single edge (edge-level privacy) or a single node and all its adjacent edges (node-level privacy). To circumvent the accumulation of privacy cost at every forward pass of the model, we tailor the GNN architecture to the specifics of private learning. In particular, we first precompute private aggregations by recursively applying neighborhood aggregation and perturbing the output of each aggregation step. Then, we privately train a deep neural network on the resulting perturbed aggregations for any node-wise classification task. A major advantage of GAP over previous approaches is that we guarantee edge-level and node-level DP not only during training, but also at inference time, with no additional cost beyond the training privacy budget. We theoretically analyze the formal privacy guarantees of GAP using Rényi DP. Experiments on three real-world graph datasets demonstrate that GAP achieves a favorable privacy-accuracy trade-off and significantly outperforms existing approaches.
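The precomputation step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the hop count, noise scale `sigma`, and the normalization scheme below are illustrative assumptions, and a real deployment would calibrate `sigma` from the target privacy budget via Rényi DP accounting. The key ideas it shows are (1) bounding each node's contribution by row-normalizing features, so the sensitivity of the sum aggregation is bounded, and (2) adding Gaussian noise once per aggregation hop before training, so the privacy cost does not accumulate with every forward pass.

```python
import numpy as np

def private_aggregate(x, adj, hops=2, sigma=1.0, rng=None):
    """Sketch of aggregation perturbation (illustrative, not the official GAP code).

    x   : (n, d) node feature matrix
    adj : (n, n) binary adjacency matrix
    Returns a list of `hops` perturbed aggregation matrices, computed
    once up front; a classifier is then trained on these cached outputs.
    """
    rng = rng or np.random.default_rng(0)
    # Row-normalize so each node contributes a vector of L2 norm <= 1
    # to any neighborhood sum (this bounds the aggregation's sensitivity).
    h = x / np.maximum(np.linalg.norm(x, axis=1, keepdims=True), 1e-12)
    cache = []
    for _ in range(hops):
        agg = adj @ h                               # sum over neighbors
        agg = agg + rng.normal(0.0, sigma, agg.shape)  # Gaussian perturbation
        # Re-normalize before the next hop so sensitivity stays bounded.
        h = agg / np.maximum(np.linalg.norm(agg, axis=1, keepdims=True), 1e-12)
        cache.append(h)
    return cache
```

Because the noisy aggregations are cached, both training and inference read only these perturbed matrices, which is why the paper's inference-time guarantee comes at no extra privacy cost.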


Related research

08/09/2023  Differentially Private Graph Neural Network with Importance-Grained Noise Adaption
11/23/2021  Node-Level Differentially Private Graph Neural Networks
08/14/2021  LinkTeller: Recovering Private Edges from Graph Neural Networks via Influence Analysis
03/17/2022  SoK: Differential Privacy on Graph-Structured Data
07/14/2022  Differentially Private Graph Learning via Sensitivity-Bounded Personalized PageRank
01/27/2023  SplitGNN: Splitting GNN for Node Classification with Heterogeneous Attention
06/05/2021  GraphMI: Extracting Private Graph Data from Graph Neural Networks
