Graph Information Bottleneck

by Tailin Wu, et al.

Representation learning of graph-structured data is challenging because both graph structure and node features carry important information. Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features. However, GNNs are prone to adversarial attacks. Here we introduce Graph Information Bottleneck (GIB), an information-theoretic principle that optimally balances expressiveness and robustness of the learned representation of graph-structured data. Inheriting from the general Information Bottleneck (IB), GIB aims to learn the minimal sufficient representation for a given task by maximizing the mutual information between the representation and the target, while simultaneously constraining the mutual information between the representation and the input data. Different from the general IB, GIB regularizes the structural as well as the feature information. We design two sampling algorithms for structural regularization and instantiate the GIB principle with two new models, GIB-Cat and GIB-Bern, and demonstrate their benefits by evaluating resilience to adversarial attacks. We show that our proposed models are more robust than state-of-the-art graph defense models. GIB-based models empirically achieve up to 31% improvement under adversarial perturbation of the graph structure as well as node features.
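The IB trade-off the abstract describes (maximize mutual information with the target, constrain mutual information with the input) is commonly trained via a variational bound: a cross-entropy term lower-bounds I(Y; Z), and a KL term against a fixed prior upper-bounds I(X; Z). The sketch below shows this generic variational-IB loss, not the paper's exact GIB instantiation; the Gaussian encoder parameters `mu`/`logvar` and the weight `beta` are illustrative assumptions.

```python
import numpy as np

def vib_loss(logits, labels, mu, logvar, beta=0.01):
    """Generic variational IB-style loss (a sketch, not the GIB objective):
    cross-entropy lower-bounds I(Y; Z); KL(q(z|x) || N(0, I)) upper-bounds I(X; Z)."""
    # Numerically stable log-softmax for the cross-entropy term.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels].mean()
    # KL between a diagonal Gaussian q(z|x) = N(mu, exp(logvar)) and N(0, I).
    kl = 0.5 * (np.exp(logvar) + mu**2 - 1.0 - logvar).sum(axis=1).mean()
    # Minimizing this trades prediction quality against compression of the input.
    return ce + beta * kl
```

GIB applies this kind of regularization to both node features and graph structure (via the two sampling algorithms mentioned above), whereas this sketch covers only a feature-style Gaussian encoder.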







Robust Unsupervised Graph Representation Learning via Mutual Information Maximization

Recent studies have shown that GNNs are vulnerable to adversarial attack...

Compact Graph Structure Learning via Mutual Information Compression

Graph Structure Learning (GSL) recently has attracted considerable atten...

Structure-Aware Hierarchical Graph Pooling using Information Bottleneck

Graph pooling is an essential ingredient of Graph Neural Networks (GNNs)...

What Information Does a ResNet Compress?

The information bottleneck principle (Shwartz-Ziv & Tishby, 2017) sugg...

Graph Structure Learning with Variational Information Bottleneck

Graph Neural Networks (GNNs) have shown promising results on a broad spe...

Utilizing Edge Features in Graph Neural Networks via Variational Information Maximization

Graph Neural Networks (GNNs) achieve an impressive performance on struct...

InfoBERT: Improving Robustness of Language Models from An Information Theoretic Perspective

Large-scale language models such as BERT have achieved state-of-the-art ...

Code Repositories


Graph Information Bottleneck (GIB) for learning minimal sufficient structural and feature information using GNNs
