MixHop: Higher-Order Graph Convolution Architectures via Sparsified Neighborhood Mixing
Existing popular methods for semi-supervised learning with Graph Neural Networks (such as the Graph Convolutional Network) provably cannot learn a general class of neighborhood mixing relationships. To address this weakness, we propose a new model, MixHop, that can learn these relationships, including difference operators, by repeatedly mixing feature representations of neighbors at various distances. MixHop requires no additional memory or computational complexity, and it outperforms challenging baselines. In addition, we propose a sparsity regularization that allows us to visualize how the network prioritizes neighborhood information across different graph datasets. Our analysis of the learned architectures reveals that neighborhood mixing varies per dataset.
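To make the neighborhood-mixing idea concrete, here is a minimal sketch of a MixHop-style layer in PyTorch. It is not the authors' reference implementation: the names (`MixHopLayer`, `powers`, `out_per_power`) are illustrative, and a dense, symmetrically normalized adjacency with self-loops is assumed for simplicity. Each adjacency power gets its own linear transform, and the per-power outputs are concatenated column-wise so later layers can combine (and subtract) information from different neighborhood distances.

```python
import torch
import torch.nn as nn


class MixHopLayer(nn.Module):
    """Sketch of a MixHop-style layer: for each adjacency power j in `powers`,
    propagate the features j times through the normalized adjacency, apply a
    separate linear transform, and concatenate the results column-wise."""

    def __init__(self, in_dim, out_per_power, powers=(0, 1, 2)):
        super().__init__()
        self.powers = powers
        self.linears = nn.ModuleList(
            [nn.Linear(in_dim, out_per_power) for _ in powers]
        )

    def forward(self, x, adj_norm):
        # adj_norm: dense normalized adjacency with self-loops, shape (N, N)
        # x: node features, shape (N, in_dim)
        outputs = []
        for j, lin in zip(self.powers, self.linears):
            h = x
            for _ in range(j):        # compute A^j X by repeated propagation
                h = adj_norm @ h
            outputs.append(torch.relu(lin(h)))
        # Column-wise concatenation exposes all neighborhood distances at once,
        # which is what allows the model to learn difference operators.
        return torch.cat(outputs, dim=-1)


# Toy usage on a 4-node path graph (illustrative only).
adj = torch.tensor([[0., 1, 0, 0],
                    [1, 0, 1, 0],
                    [0, 1, 0, 1],
                    [0, 0, 1, 0]])
adj = adj + torch.eye(4)                      # add self-loops
d_inv_sqrt = adj.sum(1).pow(-0.5)
adj_norm = d_inv_sqrt[:, None] * adj * d_inv_sqrt[None, :]

x = torch.randn(4, 3)                         # 4 nodes, 3 input features
layer = MixHopLayer(in_dim=3, out_per_power=8, powers=(0, 1, 2))
print(layer(x, adj_norm).shape)               # torch.Size([4, 24])
```

The sparsity regularization mentioned in the abstract would sit on top of such a layer, penalizing the groups of output columns associated with each power so that the learned architecture reveals which neighborhood distances a dataset actually uses; that penalty is omitted from this sketch.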