Towards Arbitrarily Expressive GNNs in O(n^2) Space by Rethinking Folklore Weisfeiler-Lehman

06/05/2023
by Jiarui Feng, et al.

Message passing neural networks (MPNNs) have emerged as the most popular framework for graph neural networks (GNNs) in recent years. However, their expressive power is bounded by the 1-dimensional Weisfeiler-Lehman (1-WL) test. Several works take inspiration from k-WL/FWL (Folklore WL) and design corresponding neural versions. Despite their high expressive power, this line of research has serious limitations. In particular, (1) k-WL/FWL requires at least O(n^k) space, which is impractical for large graphs even when k = 3; and (2) the design space of k-WL/FWL is rigid, the only adjustable hyper-parameter being k. To tackle the first limitation, we propose an extension, (k, t)-FWL. We theoretically prove that even with the space complexity fixed at O(n^2), (k, t)-FWL yields an expressiveness hierarchy that extends up to solving the graph isomorphism problem. To tackle the second limitation, we propose k-FWL+, which considers any equivariant set as the neighborhood instead of the full node set, thereby greatly expanding the design space of k-FWL. Combining these two modifications yields a flexible and powerful framework, (k, t)-FWL+. We demonstrate that (k, t)-FWL+ can implement most existing models with matching expressiveness. We then introduce an instance of (k, t)-FWL+ called Neighborhood^2-FWL (N^2-FWL), which is both practically and theoretically sound. We prove that N^2-FWL is no less powerful than 3-WL and can encode many substructures while requiring only O(n^2) space. Finally, we design its neural version, named N^2-GNN, and evaluate its performance on various tasks. N^2-GNN achieves superior performance on almost all tasks, with record-breaking results on ZINC-Subset (0.059 MAE) and ZINC-Full (0.013 MAE), outperforming the previous state-of-the-art on ZINC-Subset by 10.6%.
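
For context, the abstract does not restate the FWL update rule it builds on. A standard formulation of the k-FWL color-refinement step from the expressiveness literature (not taken from this paper) updates the color of each k-tuple \vec{v} = (v_1, \dots, v_k) of nodes as

    c^{(r+1)}(\vec{v}) = \mathrm{HASH}\left( c^{(r)}(\vec{v}),\; \{\!\{ \left( c^{(r)}(\vec{v}[1 \leftarrow u]), \dots, c^{(r)}(\vec{v}[k \leftarrow u]) \right) : u \in V \}\!\} \right)

where \vec{v}[j \leftarrow u] denotes \vec{v} with its j-th entry replaced by u. Maintaining a color for every k-tuple is what incurs the O(n^k) space cost criticized above; the paper's (k, t)-FWL and k-FWL+ modify this aggregation step, and their precise definitions are given in the paper itself.

To make the 1-WL ceiling on MPNNs concrete, here is a minimal Python sketch of 1-WL color refinement. It is the standard algorithm, not code from the paper; the function name wl_refinement and the toy graphs are illustrative only.

from collections import Counter

def wl_refinement(adj, rounds=3):
    # 1-WL color refinement. adj maps each node to a list of neighbors.
    # Returns the histogram of final node colors; differing histograms
    # certify non-isomorphism, while equal ones are inconclusive.
    colors = {v: 0 for v in adj}  # uniform initial coloring
    for _ in range(rounds):
        # New color = own color plus the multiset of neighbor colors.
        sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                for v in adj}
        # Compress signatures back to small integer color ids.
        palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
        colors = {v: palette[sigs[v]] for v in adj}
    return Counter(colors.values())

# A 6-cycle vs. two disjoint triangles: both graphs are 2-regular,
# so 1-WL assigns them identical color histograms.
hexagon = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_refinement(hexagon) == wl_refinement(triangles))  # True

Since 1-WL cannot separate these two graphs, neither can any MPNN; a test at the 3-WL level, which N^2-FWL is proven to match or exceed, does distinguish them.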
