Mincut pooling in Graph Neural Networks

06/30/2019
by Filippo Maria Bianchi, et al.

The advance of node pooling operations in a Graph Neural Network (GNN) has lagged behind the feverish design of new graph convolution techniques, and pooling remains an important and challenging endeavor for the design of deep architectures. In this paper, we propose a pooling operation for GNNs that implements a differentiable unsupervised loss based on the mincut optimization objective. First, we validate the effectiveness of the proposed loss function by clustering nodes in citation networks and through visualization examples, such as image segmentation. Then, we show how the proposed pooling layer can be used to build a deep GNN architecture for graph classification.
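To make the idea concrete, here is a minimal NumPy sketch of a mincut-style pooling loss of the kind the abstract describes: soft cluster assignments are scored by a differentiable relaxation of the normalized mincut objective plus an orthogonality regularizer, and the assignments also coarsen the graph for the next layer. The function name and exact formulation are illustrative assumptions, not the paper's verbatim implementation.

```python
import numpy as np

def mincut_pool_losses(A, S):
    """Sketch of a differentiable mincut pooling loss (illustrative, not the paper's exact code).

    A: (N, N) symmetric adjacency matrix.
    S: (N, K) soft cluster-assignment matrix with rows summing to 1
       (e.g. the softmax output of a small MLP on node features).
    Returns the cut loss, the orthogonality loss, and the pooled adjacency.
    """
    K = S.shape[1]
    D = np.diag(A.sum(axis=1))  # degree matrix

    # Cut loss: maximize within-cluster edge weight relative to cluster degrees
    # (a continuous relaxation of the normalized mincut objective; minimized at -1).
    cut_loss = -np.trace(S.T @ A @ S) / np.trace(S.T @ D @ S)

    # Orthogonality loss: push S toward balanced, non-degenerate clusters,
    # discouraging the trivial solution of putting all nodes in one cluster.
    SS = S.T @ S
    ortho_loss = np.linalg.norm(SS / np.linalg.norm(SS) - np.eye(K) / np.sqrt(K))

    # Coarsened adjacency passed to the next GNN layer.
    A_pooled = S.T @ A @ S
    return cut_loss, ortho_loss, A_pooled
```

On a toy graph with two disconnected edges, an assignment that matches the two components attains the minimum cut loss of -1, while an assignment that splits each edge across clusters scores worse; this is the signal that lets the loss be trained without labels.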

Related research

- Edge Contraction Pooling for Graph Neural Networks (05/27/2019): Graph Neural Network (GNN) research has concentrated on improving convol...
- High-Order Pooling for Graph Neural Networks with Tensor Decomposition (05/24/2022): Graph Neural Networks (GNNs) are attracting growing attention due to the...
- Pyramidal Reservoir Graph Neural Network (04/10/2021): We propose a deep Graph Neural Network (GNN) model that alternates two t...
- PiNet: Attention Pooling for Graph Classification (08/11/2020): We propose PiNet, a generalised differentiable attention-based pooling m...
- Rethinking pooling in graph neural networks (10/22/2020): Graph pooling is a central component of a myriad of graph neural network...
- Graph Analysis and Graph Pooling in the Spatial Domain (10/03/2019): The spatial convolution layer which is widely used in the Graph Neural N...
- Graph Neural Networks for Graph Drawing (09/21/2021): Graph Drawing techniques have been developed in the last few years with ...