A Hybrid Model for Learning Embeddings and Logical Rules Simultaneously from Knowledge Graphs

09/22/2020
by Susheel Suresh, et al.

The problem of knowledge graph (KG) reasoning has been widely explored by traditional rule-based systems and, more recently, by knowledge graph embedding methods. While logical rules can capture deterministic behavior in a KG, they are brittle, and mining rules that infer facts beyond the known KG is challenging. Probabilistic embedding methods, by contrast, are effective at capturing global soft statistical tendencies, and reasoning with them is computationally efficient. However, although embedding representations learned from rich training data are expressive, the incompleteness and sparsity of real-world KGs can limit their effectiveness. We aim to leverage the complementary strengths of both approaches in a hybrid model that learns high-quality rules and embeddings simultaneously. Our method uses a cross-feedback paradigm: an embedding model guides the search of a rule-mining system, which mines rules and infers new facts; these new facts are then sampled and used to refine the embedding model. Experiments on multiple benchmark datasets show the effectiveness of our method over competitive standalone and hybrid baselines. We also demonstrate its efficacy in a sparse-KG setting and explore its connection to negative sampling.
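To make the cross-feedback idea concrete, here is a minimal, hypothetical sketch of one round of the loop on a toy KG. The entity names, the co-occurrence-based `embedding_score`, the `r1(x, y) => r2(x, y)` rule template, and all thresholds are illustrative assumptions for this sketch, not the paper's actual model: rules are mined from the known KG, candidate inferences are filtered by the (stand-in) embedding scorer, and the surviving facts are folded back into the KG, where a real system would use them to refine the embeddings.

```python
import itertools

# Toy KG of (head, relation, tail) triples.
kg = {
    ("alice", "born_in", "paris"),
    ("alice", "lives_in", "paris"),
    ("bob", "born_in", "lyon"),
    ("bob", "lives_in", "lyon"),
    ("carol", "born_in", "nice"),
}

def embedding_score(h, r, t, kg):
    # Stand-in for a trained embedding scorer: plausibility here is just
    # the fraction of known triples that mention either entity.
    return sum(1 for (a, _, b) in kg if a in (h, t) or b in (h, t)) / len(kg)

def mine_rules(kg, min_conf=0.6):
    # Mine simple rules of the form r1(x, y) => r2(x, y), keeping those
    # whose confidence on the known KG reaches min_conf.
    relations = {r for (_, r, _) in kg}
    rules = []
    for r1, r2 in itertools.permutations(relations, 2):
        support = [(h, t) for (h, r, t) in kg if r == r1]
        hits = sum(1 for (h, t) in support if (h, r2, t) in kg)
        if support and hits / len(support) >= min_conf:
            rules.append((r1, r2, hits / len(support)))
    return rules

def infer_facts(kg, rules, score_cutoff=0.1):
    # Apply the rules, keeping only inferences the embedding scorer
    # finds plausible -- the "embedding guides the rule system" direction.
    new = set()
    for (r1, r2, _) in rules:
        for (h, r, t) in kg:
            if r == r1 and (h, r2, t) not in kg:
                if embedding_score(h, r2, t, kg) >= score_cutoff:
                    new.add((h, r2, t))
    return new

# One round of the cross-feedback loop: the inferred facts are added back
# to the KG, where a real system would use them to refine the embeddings.
rules = mine_rules(kg)
new_facts = infer_facts(kg, rules)
kg |= new_facts
print(new_facts)  # prints {('carol', 'lives_in', 'nice')}
```

In this toy run, the rule `born_in(x, y) => lives_in(x, y)` holds for two of the three people with a known birthplace, so it is mined and fills in Carol's residence; a real implementation would iterate this loop, retraining the embedding model on the augmented KG between rounds.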

