Pathfinder Discovery Networks for Neural Message Passing

10/24/2020
by Benedek Rozemberczki et al.

In this work we propose Pathfinder Discovery Networks (PDNs), a method for jointly learning a message passing graph over a multiplex network together with a downstream semi-supervised model. PDNs inductively learn an aggregated weight for each edge, optimized to produce the best outcome for the downstream learning task. PDNs are a generalization of attention mechanisms on graphs that allows flexible construction of similarity functions between nodes, edge convolutions, and cheap multiscale mixing layers. We show that PDNs overcome weaknesses of existing methods for graph attention (e.g. Graph Attention Networks), such as the diminishing weight problem. Our experimental results demonstrate competitive predictive performance on academic node classification tasks. Additional results from a challenging suite of node classification experiments show how PDNs can learn a wider class of functions than existing baselines. We analyze the relative computational complexity of PDNs, and show that PDN runtime is not considerably higher than that of static-graph models. Finally, we discuss how PDNs can be used to construct an easily interpretable attention mechanism that allows users to understand information propagation in the graph.
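The core idea — collapsing the layers of a multiplex network into a single learned edge weight, then message passing on the resulting weighted graph — can be sketched in a few lines. This is a minimal NumPy illustration, not the paper's implementation: the layer sizes, the sigmoid weight squashing, and the simple weighted-sum aggregation are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy multiplex graph: 4 nodes, 5 directed edges, 3 edge-feature
# channels (one per layer of the multiplex network). All names and
# dimensions here are illustrative, not taken from the paper's code.
num_nodes = 4
edges = np.array([[0, 1], [1, 2], [2, 3], [3, 0], [0, 2]])  # (src, dst)
edge_feats = rng.random((len(edges), 3))   # per-edge multiplex features
node_feats = rng.random((num_nodes, 8))    # input node features

# Edge-scoring MLP: collapses the multiplex edge features into one
# positive scalar weight per edge (the "pathfinder" step). In practice
# these parameters are trained jointly with the downstream model.
W1 = rng.normal(size=(3, 16)); b1 = np.zeros(16)
W2 = rng.normal(size=(16, 1)); b2 = np.zeros(1)

def pdn_layer(node_feats, edges, edge_feats):
    # 1) Learn an aggregated weight for each edge.
    h = relu(edge_feats @ W1 + b1)
    w = sigmoid(h @ W2 + b2).ravel()          # shape: (num_edges,)
    # 2) Message passing on the learned weighted graph
    #    (weighted-sum aggregation; each node keeps its own features).
    out = node_feats.copy()
    for (src, dst), weight in zip(edges, w):
        out[dst] += weight * node_feats[src]
    return out, w

out, learned_w = pdn_layer(node_feats, edges, edge_feats)
print(out.shape, learned_w.shape)  # (4, 8) (5,)
```

Because the edge weights come out of a differentiable scoring function rather than a fixed adjacency matrix, gradients from the downstream classification loss can flow back into the edge scorer — which is what lets the message passing graph itself be optimized for the task.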

