Scaling up graph homomorphism for classification via sampling

04/08/2021
by Paul Beaujean, et al.

Feature generation is an open topic of investigation in graph machine learning. In this paper, we study graph homomorphism density features as a scalable alternative to homomorphism numbers, one that retains similar theoretical properties and the ability to take inductive bias into account. To this end, we propose a high-performance implementation of a simple sampling algorithm that computes additive approximations of homomorphism densities. We demonstrate in experiments that simple linear models trained on sample homomorphism densities can achieve performance comparable to graph neural networks on standard graph classification datasets. Finally, we show in experiments on synthetic data that this algorithm scales to very large graphs when implemented with Bloom filters.
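The sampling idea behind the abstract can be sketched in a few lines: the homomorphism density t(F, G) = hom(F, G) / |V(G)|^|V(F)| is the probability that a uniformly random map from V(F) to V(G) is a homomorphism, so the empirical success rate over independent random maps gives an additive approximation whose sample complexity depends only on the target error, not on the size of G. The Python sketch below illustrates this under the assumption that the target graph is undirected and its edge set (stored with both orientations) answers membership queries; the names estimate_hom_density and BloomFilter are illustrative only and do not correspond to the authors' high-performance implementation.

```python
import random


class BloomFilter:
    """Minimal Bloom filter for edge-membership queries on very large graphs.

    False positives are possible (the estimator may slightly over-count),
    false negatives are not; memory is fixed by num_bits regardless of |E|.
    """

    def __init__(self, num_bits=1 << 24, num_hashes=5):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8 + 1)

    def _positions(self, item):
        # Derive num_hashes bit positions from Python's built-in hash of (seed, item).
        for seed in range(self.num_hashes):
            yield hash((seed, item)) % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all((self.bits[pos // 8] >> (pos % 8)) & 1 for pos in self._positions(item))


def estimate_hom_density(pattern_edges, pattern_size, graph_nodes, edge_set,
                         num_samples=100_000, rng=None):
    """Additive approximation of the homomorphism density t(F, G).

    t(F, G) equals the probability that a uniformly random map V(F) -> V(G)
    is a homomorphism, so the success rate over num_samples independent maps
    is within +/- eps of it with high probability once num_samples ~ 1/eps^2
    (Hoeffding bound), independently of the size of G.
    """
    rng = rng or random.Random(0)
    hits = 0
    for _ in range(num_samples):
        # Uniformly random (not necessarily injective) map V(F) -> V(G).
        phi = [rng.choice(graph_nodes) for _ in range(pattern_size)]
        # phi is a homomorphism iff every pattern edge lands on a graph edge.
        if all((phi[u], phi[v]) in edge_set for u, v in pattern_edges):
            hits += 1
    return hits / num_samples


if __name__ == "__main__":
    # Toy example: density of the triangle K3 in a small random graph.
    n = 1_000
    rng = random.Random(1)
    nodes = list(range(n))
    edges = BloomFilter()  # a plain set() behaves identically on smaller graphs
    for u in nodes:
        for v in rng.sample(nodes, 10):
            edges.add((u, v))
            edges.add((v, u))  # store both orientations for an undirected graph
    triangle = [(0, 1), (1, 2), (2, 0)]
    print(estimate_hom_density(triangle, 3, nodes, edges, num_samples=50_000))
```

Because the estimator touches G only through edge-membership queries, swapping the exact edge set for a Bloom filter trades a one-sided, tunable error for near-constant memory, which is what allows the sampling to scale to very large graphs; the resulting vector of estimated densities (one entry per pattern graph F) can then be fed to an ordinary linear classifier, as in the experiments described above.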
