Differentially Private Decoupled Graph Convolutions for Multigranular Topology Protection

07/12/2023
by   Eli Chien, et al.

Graph learning methods, such as Graph Neural Networks (GNNs) based on graph convolutions, are highly successful at solving real-world learning problems involving graph-structured data. However, they can expose sensitive user information and interactions not only through their model parameters but also through their predictions. Consequently, standard Differential Privacy (DP) techniques that only guarantee model-weight privacy are inadequate. This is especially true for node predictions, which leverage neighboring node attributes directly via graph convolutions and thereby create additional privacy-leakage risks. To address this problem, we introduce Graph Differential Privacy (GDP), a new formal DP framework tailored to graph learning that ensures both provably private model parameters and provably private predictions. Furthermore, since node attributes and graph structure may carry different privacy requirements, we introduce a novel notion of relaxed node-level data adjacency. This relaxation can be used to establish guarantees for different degrees of graph-topology privacy while maintaining node-attribute privacy, and it reveals a useful trade-off between utility and topology privacy for graph learning methods. Our analysis of GDP further shows that existing DP-GNNs fail to exploit this trade-off because of the complex interplay between graph topology and attribute data in standard graph-convolution designs. To mitigate this problem, we introduce the Differentially Private Decoupled Graph Convolution (DPDGC) model, which benefits from decoupled graph convolution while providing GDP guarantees. Extensive experiments on seven node-classification benchmark datasets demonstrate the superior privacy-utility trade-off of DPDGC over existing DP-GNNs based on standard graph convolution.
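To make the decoupling idea concrete, here is a minimal, illustrative sketch (not the paper's actual DPDGC algorithm; all function names and parameters are hypothetical). In a decoupled design, node features are first transformed independently of the graph (e.g., by an MLP), and the graph structure enters only in a separate propagation step. Privacy noise can then be confined to that step: each node's contribution is clipped to bound sensitivity, and Gaussian noise is added to the aggregate, in the style of the Gaussian mechanism.

```python
import math
import random

def clip_row(row, c):
    """Scale a feature vector so its L2 norm is at most c.

    Clipping bounds each node's contribution to any aggregate,
    which is what makes the added Gaussian noise meaningful for DP.
    """
    norm = math.sqrt(sum(x * x for x in row))
    scale = min(1.0, c / norm) if norm > 0 else 1.0
    return [x * scale for x in row]

def dp_decoupled_propagate(H, A, clip=1.0, sigma=1.0, seed=None):
    """One noisy propagation step over pre-transformed embeddings.

    H : list of n embedding vectors (already produced by a separate,
        graph-agnostic feature transformation such as an MLP)
    A : n x n adjacency matrix (0/1 entries, no self-loops)

    Because the design is decoupled, the graph topology is touched
    only here, so noise calibrated to `clip` protects its influence.
    """
    rng = random.Random(seed)
    Hc = [clip_row(h, clip) for h in H]          # bound per-node sensitivity
    d = len(H[0])
    out = []
    for row in A:
        agg = [0.0] * d
        for j, a in enumerate(row):
            if a:                                 # sum clipped neighbors
                for k in range(d):
                    agg[k] += Hc[j][k]
        # Gaussian mechanism: noise scale proportional to the clip bound
        out.append([v + rng.gauss(0.0, sigma * clip) for v in agg])
    return out
```

In a standard (coupled) graph convolution, the learned transformation and the neighborhood aggregation are interleaved in every layer, so topology and attributes mix repeatedly and noise must be injected at each layer; the decoupled form above isolates the single step where topology matters, which is the structural property the abstract credits for the better privacy-utility trade-off.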
