Differentially Private Graph Classification with GNNs

02/05/2022
by Tamara T. Mueller, et al.

Graph Neural Networks (GNNs) have established themselves as the state-of-the-art models for many machine learning applications such as the analysis of social networks, protein interactions and molecules. Several among these datasets contain privacy-sensitive data. Machine learning with differential privacy is a promising technique to allow deriving insight from sensitive data while offering formal guarantees of privacy protection. However, the differentially private training of GNNs has so far remained under-explored due to the challenges presented by the intrinsic structural connectivity of graphs. In this work, we introduce differential privacy for graph-level classification, one of the key applications of machine learning on graphs. Our method is applicable to deep learning on multi-graph datasets and relies on differentially private stochastic gradient descent (DP-SGD). We show results on a variety of synthetic and public datasets and evaluate the impact of different GNN architectures and training hyperparameters on model performance for differentially private graph classification. Finally, we apply explainability techniques to assess whether similar representations are learned in the private and non-private settings and establish robust baselines for future work in this area.
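
To make the DP-SGD mechanism referenced above concrete, the sketch below trains a toy GNN for graph-level classification with per-graph gradient clipping and Gaussian noise. This is a minimal illustration, not the authors' implementation: the one-layer architecture, dense adjacency matrices, hyperparameters, and random toy graphs are all assumptions made for the example.

```python
# Minimal DP-SGD sketch for graph classification (illustrative only).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGNN(nn.Module):
    """One round of message passing (A X W), mean pooling, linear classifier."""
    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, num_classes)

    def forward(self, adj, x):
        h = F.relu(adj @ self.lin1(x))  # neighbourhood aggregation
        h = h.mean(dim=0)               # graph-level readout (mean pooling)
        return self.lin2(h)             # class logits for the whole graph

def dp_sgd_step(model, batch, lr=0.1, clip_norm=1.0, noise_multiplier=1.0):
    """One DP-SGD update: clip each per-graph gradient, sum, add Gaussian noise."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for adj, x, y in batch:  # per-example (per-graph) gradients
        loss = F.cross_entropy(model(adj, x).unsqueeze(0), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = torch.clamp(clip_norm / (norm + 1e-6), max=1.0)  # clip to clip_norm
        for s, g in zip(summed, grads):
            s += g * scale
    with torch.no_grad():
        for p, s in zip(params, summed):
            noise = torch.randn_like(s) * noise_multiplier * clip_norm
            p -= lr * (s + noise) / len(batch)  # noisy, averaged update

# Toy usage with random dense graphs (shapes and labels are arbitrary).
torch.manual_seed(0)
model = SimpleGNN(in_dim=8, hid_dim=16, num_classes=2)
batch = []
for _ in range(4):
    n = torch.randint(5, 10, (1,)).item()
    a = (torch.rand(n, n) < 0.3).float()
    adj = torch.clamp(((a + a.T) > 0).float() + torch.eye(n), max=1.0)  # symmetric + self-loops
    x = torch.randn(n, 8)
    y = torch.randint(0, 2, (1,)).squeeze()
    batch.append((adj, x, y))
dp_sgd_step(model, batch)
```

In practice, the privacy budget (epsilon, delta) spent over many such steps would be tracked with a privacy accountant (for example, the RDP accountant provided by libraries such as Opacus); that bookkeeping is omitted from this sketch.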
