A Hybrid Model for Learning Embeddings and Logical Rules Simultaneously from Knowledge Graphs

by Susheel Suresh, et al.

The problem of knowledge graph (KG) reasoning has been widely explored by traditional rule-based systems and, more recently, by knowledge graph embedding methods. While logical rules can capture deterministic behavior in a KG, they are brittle, and mining rules that infer facts beyond the known KG is challenging. Probabilistic embedding methods are effective in capturing global soft statistical tendencies, and reasoning with them is computationally efficient. While embedding representations learned from rich training data are expressive, incompleteness and sparsity in real-world KGs can impact their effectiveness. We aim to leverage the complementary properties of both methods to develop a hybrid model that learns both high-quality rules and embeddings simultaneously. Our method uses a cross-feedback paradigm wherein an embedding model guides the search of a rule mining system to mine rules and infer new facts. These new facts are sampled and further used to refine the embedding model. Experiments on multiple benchmark datasets show the effectiveness of our method over other competitive standalone and hybrid baselines. We also show its efficacy in a sparse KG setting and finally explore the connection with negative sampling.
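To make the cross-feedback loop concrete, here is a minimal, hypothetical sketch in Python. It is not the paper's implementation: the toy KG, the TransE-style scorer with random (untrained) vectors, and the composition-rule miner are all assumptions made for illustration. The loop's shape matches the abstract: mine rules r1(x,y) ∧ r2(y,z) ⇒ r3(x,z) from the KG, forward-chain them to propose new facts, and rank those candidates with the embedding scorer so the top ones can be fed back as extra training positives.

```python
import random
from collections import defaultdict

# Toy KG as (head, relation, tail) triples -- an assumed example, not a benchmark.
kg = {
    ("alice", "born_in", "paris"),
    ("paris", "city_of", "france"),
    ("alice", "nationality", "france"),
    ("bob", "born_in", "berlin"),
    ("berlin", "city_of", "germany"),
}

def make_embeddings(kg, dim=8, seed=0):
    """Random TransE-style vectors; a stand-in for a trained embedding model."""
    rng = random.Random(seed)
    ents = {e for h, _, t in kg for e in (h, t)}
    rels = {r for _, r, _ in kg}
    vec = lambda: [rng.uniform(-1, 1) for _ in range(dim)]
    return {e: vec() for e in ents}, {r: vec() for r in rels}

def score(triple, ent_emb, rel_emb):
    """TransE plausibility: -||h + r - t||^2, higher means more plausible."""
    h, r, t = triple
    return -sum((hi + ri - ti) ** 2
                for hi, ri, ti in zip(ent_emb[h], rel_emb[r], ent_emb[t]))

def mine_composition_rules(kg, min_support=1):
    """Find length-2 rules r1(x,y) & r2(y,z) => r3(x,z) supported by the KG."""
    by_head = defaultdict(list)
    for h, r, t in kg:
        by_head[h].append((r, t))
    support = defaultdict(int)
    for h, r1, t in kg:
        for r2, z in by_head[t]:          # path x --r1--> y --r2--> z
            for r3, z2 in by_head[h]:     # direct edge x --r3--> z
                if z2 == z:
                    support[(r1, r2, r3)] += 1
    return [rule for rule, s in support.items() if s >= min_support]

def apply_rules(kg, rules):
    """Forward-chain the rules one step to propose facts not yet in the KG."""
    by_head = defaultdict(list)
    for h, r, t in kg:
        by_head[h].append((r, t))
    new = set()
    for r1, r2, r3 in rules:
        for h, r, t in kg:
            if r != r1:
                continue
            for r2b, z in by_head[t]:
                if r2b == r2 and (h, r3, z) not in kg:
                    new.add((h, r3, z))
    return new

ent_emb, rel_emb = make_embeddings(kg)
rules = mine_composition_rules(kg)
candidates = apply_rules(kg, rules)
# Embedding scores rank the candidates; in the full method the top-ranked
# facts would be sampled back into embedding training as extra positives.
ranked = sorted(candidates, key=lambda tr: score(tr, ent_emb, rel_emb),
                reverse=True)
```

On this toy KG the miner recovers the rule born_in(x,y) ∧ city_of(y,z) ⇒ nationality(x,z) and proposes the missing fact (bob, nationality, germany); in the actual hybrid model the embedding scores would also prune low-quality rules during the search, which this sketch omits.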

