Local-to-global Perspectives on Graph Neural Networks

06/11/2023
by Chen Cai, et al.

This thesis presents a local-to-global perspective on graph neural networks (GNNs), the leading architectures for processing graph-structured data. After categorizing GNNs into local Message Passing Neural Networks (MPNNs) and global Graph Transformers, we present three pieces of work: 1) a study of the convergence properties of a type of global GNN, Invariant Graph Networks; 2) a connection between local MPNNs and global Graph Transformers; and 3) the use of local MPNNs for graph coarsening, a standard subroutine in global modeling.
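The local/global distinction above can be made concrete with two toy layer updates. The sketch below is not taken from the thesis; the function names, the NumPy-only setup, and the specific aggregation and attention choices are illustrative assumptions. It contrasts a message-passing layer, which aggregates only over graph neighbors, with a transformer-style layer, in which every node attends to every other node.

```python
# Minimal sketch (illustrative, not the thesis' method): a local MPNN layer
# versus a global transformer-style layer, in plain NumPy.
import numpy as np

def mpnn_layer(X, A, W):
    """Local message passing: each node averages features over its graph
    neighbors, then a shared linear map and ReLU are applied."""
    deg = np.maximum(A.sum(axis=1, keepdims=True), 1)  # node degrees (avoid /0)
    messages = (A @ X) / deg                           # mean over neighbors
    return np.maximum((X + messages) @ W, 0)           # combine + nonlinearity

def graph_transformer_layer(X, Wq, Wk, Wv):
    """Global self-attention: every node attends to every other node;
    edge structure and positional/structural encodings are omitted."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)            # row-wise softmax
    return attn @ V

# Toy usage: 4 nodes on a path graph, 8-dimensional features.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
W = rng.normal(size=(8, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(mpnn_layer(X, A, W).shape)                      # (4, 8) -- local update
print(graph_transformer_layer(X, Wq, Wk, Wv).shape)   # (4, 8) -- global update
```

In this toy picture, the MPNN update only reads the adjacency matrix A (local), while the transformer update ignores A entirely and mixes all node pairs (global); combining or interpolating between the two is the kind of question the second piece of work addresses.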
