Dynamically Pruned Message Passing Networks for Large-Scale Knowledge Graph Reasoning

by Xiaoran Xu, et al.

We propose Dynamically Pruned Message Passing Networks (DPMPN) for large-scale knowledge graph reasoning. In contrast to existing models, whether embedding-based or path-based, we learn an input-dependent subgraph to explicitly model a sequential reasoning process. Each subgraph is dynamically constructed, expanding itself selectively under a flow-style attention mechanism. In this way, we can not only construct graphical explanations to interpret predictions, but also prune message passing in Graph Neural Networks (GNNs) so that it scales with the size of the graph. Taking inspiration from the consciousness prior proposed by Bengio, we design a two-GNN framework that encodes a global, input-invariant graph-structured representation and learns a local, input-dependent one, with the two coordinated by an attention module. Experiments show that our model both provides clear graphical explanations and predicts accurately, outperforming most state-of-the-art methods on knowledge base completion tasks.
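The abstract's core idea can be illustrated with a minimal sketch (this is NOT the authors' implementation, and the helper names `attention_score`, `expand_topk`, and `prune_message_pass` are hypothetical): a subgraph is grown step by step by expanding the highest-attention frontier neighbors, and each GNN message-passing step is restricted to edges inside that subgraph, so compute scales with the attended subgraph rather than the full graph.

```python
import numpy as np

rng = np.random.default_rng(0)

num_nodes, dim = 8, 4
# Toy adjacency list for a small directed knowledge graph.
adj = {0: [1, 2], 1: [3], 2: [3, 4], 3: [5], 4: [6], 5: [7], 6: [7], 7: []}
h = rng.normal(size=(num_nodes, dim))   # node embeddings
query = rng.normal(size=dim)            # input-dependent query vector

def attention_score(node):
    """Unnormalized attention: similarity of a node's embedding to the query."""
    return float(h[node] @ query)

def expand_topk(subgraph, k=2):
    """Grow the subgraph by the k highest-attention frontier neighbors."""
    frontier = {v for u in subgraph for v in adj[u] if v not in subgraph}
    best = sorted(frontier, key=attention_score, reverse=True)[:k]
    return subgraph | set(best)

def prune_message_pass(subgraph):
    """One GNN step where messages flow only along edges inside the subgraph."""
    new_h = h.copy()
    for u in subgraph:
        msgs = [h[v] for v in adj[u] if v in subgraph]
        if msgs:
            new_h[u] = h[u] + np.mean(msgs, axis=0)  # toy aggregation/update
    return new_h

subgraph = {0}              # start from the query's head entity
for _ in range(3):          # a few sequential reasoning steps
    subgraph = expand_topk(subgraph, k=2)
    h = prune_message_pass(subgraph)

print(sorted(subgraph))     # the attended subgraph doubles as an explanation
```

In the paper's terms, the grown subgraph plays the role of the local, input-dependent representation, while the frozen full-graph embeddings stand in for the global, input-invariant one; here both are random toys chosen only to make the control flow concrete.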


