Graph Information Bottleneck

10/24/2020
by Tailin Wu, et al.

Representation learning on graph-structured data is challenging because both the graph structure and the node features carry important information. Graph Neural Networks (GNNs) provide an expressive way to fuse information from network structure and node features. However, GNNs are prone to adversarial attacks. Here we introduce Graph Information Bottleneck (GIB), an information-theoretic principle that optimally balances the expressiveness and robustness of the learned representation of graph-structured data. Inheriting from the general Information Bottleneck (IB), GIB aims to learn the minimal sufficient representation for a given task by maximizing the mutual information between the representation and the target while constraining the mutual information between the representation and the input data. Unlike the general IB, GIB regularizes structural as well as feature information. We design two sampling algorithms for structural regularization and instantiate the GIB principle with two new models, GIB-Cat and GIB-Bern, and demonstrate their benefits by evaluating resilience to adversarial attacks. We show that our proposed models are more robust than state-of-the-art graph defense models. GIB-based models empirically achieve up to 31% improvement with adversarial perturbation of the graph structure as well as node features.
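In variational instantiations of IB-style objectives, the trade-off described above is usually trained as a single loss: a task cross-entropy term (a variational lower bound on the mutual information between representation and target) plus beta-weighted penalty terms that upper-bound the information the representation keeps about the input features and structure. A minimal sketch of that loss shape, assuming NumPy and hypothetical scalar bounds `feat_kl` and `struct_kl` produced elsewhere by the encoder (an illustration of the principle, not the authors' implementation):

```python
import numpy as np

def gib_loss(logits, targets, feat_kl, struct_kl, beta1=0.001, beta2=0.01):
    """IB-style training objective (sketch).

    logits:    (N, C) class scores for N nodes
    targets:   (N,) integer class labels
    feat_kl:   scalar upper bound on representation-feature information
    struct_kl: scalar upper bound on representation-structure information
    """
    # Numerically stable softmax cross-entropy: minimizing it maximizes
    # a lower bound on I(representation; target).
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    task_term = -log_probs[np.arange(len(targets)), targets].mean()

    # Beta-weighted penalties constrain I(representation; input),
    # covering feature and structural information separately.
    return task_term + beta1 * feat_kl + beta2 * struct_kl
```

The two separate penalty weights mirror GIB's key departure from the general IB: structural and feature information are regularized as distinct terms rather than through a single bottleneck.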



Code Repositories

GIB

Graph Information Bottleneck (GIB) for learning minimal sufficient structural and feature information using GNNs
