Pathfinder Discovery Networks for Neural Message Passing

by Benedek Rozemberczki, et al.

In this work we propose Pathfinder Discovery Networks (PDNs), a method for jointly learning a message passing graph over a multiplex network together with a downstream semi-supervised model. PDNs inductively learn an aggregated weight for each edge, optimized to produce the best outcome for the downstream learning task. PDNs generalize attention mechanisms on graphs, allowing flexible construction of similarity functions between nodes, edge convolutions, and cheap multiscale mixing layers. We show that PDNs overcome weaknesses of existing graph attention methods (e.g. Graph Attention Networks), such as the diminishing weight problem. Our experimental results demonstrate competitive predictive performance on academic node classification tasks. Additional results from a challenging suite of node classification experiments show how PDNs can learn a wider class of functions than existing baselines. We analyze the relative computational complexity of PDNs and show that PDN runtime is not considerably higher than that of static-graph models. Finally, we discuss how PDNs can be used to construct an easily interpretable attention mechanism that allows users to understand information propagation in the graph.
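The core idea of the abstract, learning a single nonnegative weight per edge from multiplex edge features and then message passing over the resulting weighted graph, can be illustrated with a minimal sketch. Everything below (the `softplus` scorer, the function names, the toy graph) is an illustrative assumption, not the paper's exact architecture; in the full model the parameter vector `theta` would be trained jointly with the downstream classifier.

```python
import numpy as np

def softplus(x):
    # Smooth nonnegative activation, keeping learned edge weights positive.
    return np.log1p(np.exp(x))

def pathfinder_edge_weights(edge_feats, theta):
    """Map per-edge multiplex features of shape (E, K) to one weight per edge.

    Each edge carries a K-dimensional feature vector (e.g. indicators of which
    of the K network layers it appears in); a learnable scorer aggregates these
    into a single nonnegative weight.
    """
    return softplus(edge_feats @ theta)  # shape (E,)

def propagate(num_nodes, edges, weights, h):
    """One row-normalized weighted message passing step: h' = D^{-1} A_w h."""
    agg = np.zeros_like(h)
    deg = np.zeros(num_nodes)
    for (u, v), w in zip(edges, weights):
        agg[u] += w * h[v]
        deg[u] += w
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes
    return agg / deg[:, None]

# Toy multiplex graph: 3 nodes, 4 directed edges, K = 2 layer indicators.
edges = [(0, 1), (1, 0), (1, 2), (2, 1)]
edge_feats = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
theta = np.array([0.5, -0.5])  # learnable in the full model
h = np.eye(3)                  # one-hot node features

w = pathfinder_edge_weights(edge_feats, theta)
h_new = propagate(3, edges, w, h)
```

Because the weights stay positive and each row is normalized, every node's new representation is a convex combination of its neighbors' features, which is what makes the learned edge weights directly readable as an attention-style explanation of information flow.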


Multi-Level Attention Pooling for Graph Neural Networks: Unifying Graph Representations with Multiple Localities

Graph neural networks (GNNs) have been widely used to learn vector repre...

Graph Joint Attention Networks

Graph attention networks (GATs) have been recognized as powerful tools f...

Dispatcher: A Message-Passing Approach To Language Modelling

This paper proposes a message-passing mechanism to address language mode...

Graph Convolutional Networks with Dual Message Passing for Subgraph Isomorphism Counting and Matching

Graph neural networks (GNNs) and message passing neural networks (MPNNs)...

PushNet: Efficient and Adaptive Neural Message Passing

Message passing neural networks have recently evolved into a state-of-th...

Geometric and Physical Quantities improve E(3) Equivariant Message Passing

Including covariant information, such as position, force, velocity or sp...

Modeling Attention Flow on Graphs

Real-world scenarios demand reasoning about process, more than final out...