Heterogeneous Randomized Response for Differential Privacy in Graph Neural Networks

11/10/2022
by Khang Tran, et al.

Graph neural networks (GNNs) are susceptible to privacy inference attacks (PIAs), given their ability to learn joint representations from the features and edges among nodes in graph data. To prevent privacy leakage in GNNs, we propose a novel heterogeneous randomized response (HeteroRR) mechanism that protects nodes' features and edges against PIAs under differential privacy (DP) guarantees without an undue cost in data and model utility when training GNNs. Our idea is to balance the importance and sensitivity of nodes' features and edges when redistributing the privacy budget, since some features and edges are more sensitive or more important to model utility than others. As a result, we derive significantly better randomization probabilities and tighter error bounds at both the feature and edge levels compared with existing approaches, enabling us to maintain high data utility for training GNNs. Extensive theoretical and empirical analysis on benchmark datasets shows that HeteroRR significantly outperforms various baselines in terms of model utility under rigorous privacy protection for both nodes' features and edges, enabling an effective defense against PIAs in DP-preserving GNNs.
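The abstract's core idea is to split a total privacy budget across features and edges in proportion to their importance before applying randomized response, so that less noise is injected where utility matters most. A minimal generic sketch of that idea (not the paper's actual HeteroRR derivation; the function names, the proportional allocation rule, and the restriction to binary features are illustrative assumptions) might look like:

```python
import math
import random

def rr_flip_prob(eps):
    # Binary randomized response: flip the true bit with probability
    # 1 / (e^eps + 1), which satisfies eps-local differential privacy.
    return 1.0 / (math.exp(eps) + 1.0)

def allocate_budgets(importance, eps_total):
    # Illustrative heterogeneous allocation: split the total budget
    # proportionally to importance scores, so more important features
    # receive a larger per-feature budget (and hence less noise).
    total = sum(importance)
    return [eps_total * w / total for w in importance]

def randomize_bits(bits, budgets):
    # Apply randomized response independently per feature, each with
    # its own privacy budget.
    out = []
    for b, eps in zip(bits, budgets):
        p = rr_flip_prob(eps)
        out.append(1 - b if random.random() < p else b)
    return out
```

Under this allocation, the per-feature budgets sum to the total budget (sequential composition over independent features), while a homogeneous mechanism would simply use `eps_total / len(bits)` for every feature.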

Related research

- Degree-Preserving Randomized Response for Graph Neural Networks under Local Differential Privacy (02/21/2022)
- LPGNet: Link Private Graph Networks for Node Classification (05/06/2022)
- Blink: Link Local Differential Privacy in Graph Neural Networks via Bayesian Estimation (09/06/2023)
- NetFense: Adversarial Defenses against Privacy Attacks on Neural Networks for Graph Data (06/22/2021)
- Evaluating the Impact of Local Differential Privacy on Utility Loss via Influence Functions (09/15/2023)
- Local Differential Privacy in Graph Neural Networks: a Reconstruction Approach (09/15/2023)
- LinkTeller: Recovering Private Edges from Graph Neural Networks via Influence Analysis (08/14/2021)
