Scaling up graph homomorphism for classification via sampling

by Paul Beaujean, et al.

Feature generation is an open topic of investigation in graph machine learning. In this paper, we study the use of graph homomorphism density features as a scalable alternative to homomorphism numbers; densities retain similar theoretical properties and the ability to encode inductive bias. To this end, we propose a high-performance implementation of a simple sampling algorithm that computes additive approximations of homomorphism densities. In the context of graph machine learning, we demonstrate in experiments that simple linear models trained on sampled homomorphism densities can achieve performance comparable to graph neural networks on standard graph classification datasets. Finally, we show in experiments on synthetic data that this algorithm scales to very large graphs when implemented with Bloom filters.
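The sampling idea behind the abstract can be sketched as follows. The homomorphism density t(F, G) equals the probability that a uniformly random map from V(F) to V(G) preserves every edge of F, so averaging indicator samples yields an additive approximation whose error shrinks like 1/sqrt(num_samples) by Hoeffding's inequality. The sketch below is a minimal Monte Carlo estimator under that definition; the function name, the adjacency-dict representation, and the parameters are illustrative choices, not the paper's actual implementation (which, per the abstract, also uses Bloom filters for edge-membership queries on very large graphs).

```python
import random

def hom_density_estimate(pattern_edges, host_adj, num_pattern_vertices,
                         num_samples=10000, rng=None):
    """Monte Carlo additive approximation of the homomorphism density t(F, G).

    pattern_edges: list of (u, v) edges of the pattern graph F,
        with vertices labelled 0..num_pattern_vertices-1.
    host_adj: dict mapping each vertex of the host graph G to its
        set of neighbours (a Bloom filter could stand in for these
        sets to trade a small false-positive rate for memory).
    """
    rng = rng or random.Random()
    host_vertices = list(host_adj)
    hits = 0
    for _ in range(num_samples):
        # Sample a uniformly random (not necessarily injective) map V(F) -> V(G).
        phi = [rng.choice(host_vertices) for _ in range(num_pattern_vertices)]
        # Count the sample if every pattern edge maps to a host edge.
        if all(phi[v] in host_adj[phi[u]] for u, v in pattern_edges):
            hits += 1
    return hits / num_samples
```

For example, with F a triangle and G the complete graph K4, there are 4·3·2 = 24 homomorphisms out of 4³ = 64 maps, so the estimator should concentrate around 24/64 = 0.375. A feature vector for classification would stack such estimates over a fixed family of small patterns F.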


Related papers:

- Revisiting Graph Neural Networks: All We Have is Low-Pass Filters
- Classification on Large Networks: A Quantitative Bound via Motifs and Graphons
- Scaling Up Graph Neural Networks Via Graph Coarsening
- Generalizable Machine Learning in Neuroscience using Graph Neural Networks
- Simplified Graph Convolution with Heterophily
- Sampling and Recovery of Graph Signals based on Graph Neural Networks
- Morse Code Datasets for Machine Learning