Edge Contraction Pooling for Graph Neural Networks

05/27/2019
by Frederik Diehl, et al.

Graph Neural Network (GNN) research has concentrated on improving convolutional layers, with little attention paid to developing graph pooling layers. Yet pooling layers can enable GNNs to reason over abstracted groups of nodes instead of single nodes. To close this gap, we propose a graph pooling layer relying on the notion of edge contraction: EdgePool learns a localized and sparse hard pooling transform. We show that EdgePool outperforms alternative pooling methods, can be easily integrated into most GNN models, and improves performance on both node and graph classification.
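The abstract describes edge contraction only at a high level. As a rough illustration of the idea, the sketch below is one plausible realization, not the authors' reference implementation: a learned layer scores each edge from its endpoint features, the highest-scoring edges are contracted greedily so that each node is merged at most once, and merged features are gated by the edge score so the scoring layer stays trainable. All names here (EdgeContractionPool, score) are illustrative.

```python
import torch
import torch.nn as nn


class EdgeContractionPool(nn.Module):
    # Hypothetical sketch of edge-contraction pooling; details such as the
    # score normalization differ from the paper's actual layer.

    def __init__(self, in_channels):
        super().__init__()
        # Scores an edge from the concatenated features of its endpoints.
        self.score = nn.Linear(2 * in_channels, 1)

    def forward(self, x, edge_index):
        # x: [num_nodes, in_channels], edge_index: [2, num_edges]
        src, dst = edge_index
        s = torch.sigmoid(
            self.score(torch.cat([x[src], x[dst]], dim=-1)).squeeze(-1))

        # Greedy matching: walk edges from highest to lowest score and
        # contract an edge only if neither endpoint is merged yet.
        cluster = torch.full((x.size(0),), -1, dtype=torch.long)
        n = 0
        for e in torch.argsort(s, descending=True).tolist():
            u, v = src[e].item(), dst[e].item()
            if u != v and cluster[u] == -1 and cluster[v] == -1:
                cluster[u] = cluster[v] = n
                n += 1
        for u in range(x.size(0)):      # unmatched nodes survive
            if cluster[u] == -1:        # as singleton clusters
                cluster[u] = n
                n += 1

        # New node features: sum over each cluster, gated by the edge
        # score so gradients reach the scoring layer.
        gate = torch.ones(x.size(0), device=x.device)
        merged = (cluster[src] == cluster[dst]) & (src != dst)
        gate[src[merged]] = s[merged]
        gate[dst[merged]] = s[merged]
        new_x = torch.zeros(n, x.size(1), device=x.device).index_add_(
            0, cluster, x * gate.unsqueeze(-1))

        # Remap edges onto clusters, dropping self-loops and duplicates.
        ei = cluster[edge_index]
        ei = ei[:, ei[0] != ei[1]]
        return new_x, torch.unique(ei, dim=1), cluster
```

Because each contraction merges exactly one node pair, stacking such layers roughly halves the node count per level, which is what lets later convolutions reason over contracted groups of nodes rather than single nodes. For off-the-shelf use, PyTorch Geometric provides an EdgePooling layer based on this paper.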


Related research

Mincut pooling in Graph Neural Networks (06/30/2019)
The advance of node pooling operations in a Graph Neural Network (GNN) h...

Graph Neural Networks with Haar Transform-Based Convolution and Pooling: A Complete Guide (07/22/2020)
Graph Neural Networks (GNNs) have recently caught great attention and ac...

Function Space Pooling For Graph Convolutional Networks (05/15/2019)
Convolutional layers in graph neural networks are a fundamental type of ...

Pyramidal Reservoir Graph Neural Network (04/10/2021)
We propose a deep Graph Neural Network (GNN) model that alternates two t...

Path Integral Based Convolution and Pooling for Heterogeneous Graph Neural Networks (02/26/2023)
Graph neural networks (GNNs) extend deep learning to graph-structured dat...

Bi-Stride Multi-Scale Graph Neural Network for Mesh-Based Physical Simulation (10/05/2022)
Learning physical systems on unstructured meshes by flat Graph Neural Ne...

DOTIN: Dropping Task-Irrelevant Nodes for GNNs (04/28/2022)
Scalability is an important consideration for deep graph neural networks...
