
XAI for Graphs: Explaining Graph Neural Network Predictions by Identifying Relevant Walks
Graph Neural Networks (GNNs) are a popular approach for predicting graph...

An Analysis of Attentive Walk-Aggregating Graph Neural Networks
Graph neural networks (GNNs) have been shown to possess strong represent...

Weisfeiler and Leman Go Neural: Higher-order Graph Neural Networks
In recent years, graph neural networks (GNNs) have emerged as a powerful...

A Practical Guide to Graph Neural Networks
Graph neural networks (GNNs) have recently grown in popularity in the fi...

Benchmarking Graph Neural Networks
Graph neural networks (GNNs) have become the standard toolkit for analyz...

Learning Graph Normalization for Graph Neural Networks
Graph Neural Networks (GNNs) have attracted considerable attention and h...

A Novel Genetic Algorithm with Hierarchical Evaluation Strategy for Hyperparameter Optimisation of Graph Neural Networks
Graph representation of structured data can facilitate the extraction of...
Ranking Structured Objects with Graph Neural Networks
Graph neural networks (GNNs) have been successfully applied in many structured data domains, with applications ranging from molecular property prediction to the analysis of social networks. Motivated by the broad applicability of GNNs, we propose the family of so-called RankGNNs, a combination of neural Learning to Rank (LtR) methods and GNNs. RankGNNs are trained with a set of pairwise preferences between graphs, each indicating that one graph is preferred over the other. One practical application of this problem is drug screening, where an expert wants to find the most promising molecules in a large collection of drug candidates. We empirically demonstrate that our proposed pairwise RankGNN approach either significantly outperforms or at least matches the ranking performance of the naive pointwise baseline approach, in which the LtR problem is solved via GNN-based graph regression.
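The pairwise idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: a fixed-size vector stands in for a learned GNN graph embedding, a linear scorer stands in for the ranking head, and training minimizes a RankNet-style logistic loss over the given preference pairs. All function names and the toy data are assumptions for illustration.

```python
import math
import random

def pairwise_loss(s_i, s_j):
    """RankNet-style logistic loss: small when preferred graph i scores above j."""
    return math.log(1.0 + math.exp(-(s_i - s_j)))

def score(w, x):
    """Stand-in for a GNN scorer: a linear score over a fixed graph embedding x."""
    return sum(wk * xk for wk, xk in zip(w, x))

def train(prefs, dim, lr=0.1, epochs=200, seed=0):
    """prefs: list of (x_i, x_j) embedding pairs, where graph x_i is preferred."""
    rng = random.Random(seed)
    w = [rng.uniform(-0.1, 0.1) for _ in range(dim)]
    for _ in range(epochs):
        for x_i, x_j in prefs:
            s_i, s_j = score(w, x_i), score(w, x_j)
            # Gradient of log(1 + exp(-(s_i - s_j))) w.r.t. s_i is -sigmoid(s_j - s_i).
            g = -1.0 / (1.0 + math.exp(s_i - s_j))
            for k in range(dim):
                w[k] -= lr * g * (x_i[k] - x_j[k])
    return w

# Toy preference data: the first embedding coordinate encodes "quality".
prefs = [((1.0, 0.2), (0.1, 0.9)), ((0.8, 0.5), (0.3, 0.5))]
w = train(prefs, dim=2)
assert score(w, (1.0, 0.2)) > score(w, (0.1, 0.9))
```

In the paper's setting the embedding would come from a trainable GNN and the gradient would flow through it end to end; the pointwise baseline would instead regress each graph to a scalar target and sort by the predicted values.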