End-to-End Differentiable Proving

05/31/2017
by Tim Rocktäschel, et al.

We introduce neural networks for end-to-end differentiable proving of queries to knowledge bases by operating on dense vector representations of symbols. These neural networks are constructed recursively by taking inspiration from the backward chaining algorithm as used in Prolog. Specifically, we replace symbolic unification with a differentiable computation on vector representations of symbols using a radial basis function kernel, thereby combining symbolic reasoning with learning subsymbolic vector representations. By using gradient descent, the resulting neural network can be trained to infer facts from a given incomplete knowledge base. It learns to (i) place representations of similar symbols in close proximity in a vector space, (ii) make use of such similarities to prove queries, (iii) induce logical rules, and (iv) use provided and induced logical rules for multi-hop reasoning. We demonstrate that this architecture outperforms ComplEx, a state-of-the-art neural link prediction model, on three out of four benchmark knowledge bases while at the same time inducing interpretable function-free first-order logic rules.
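To make the idea of soft unification concrete, here is a minimal sketch in NumPy. The symbol names, embedding dimension, bandwidth mu, and the squared-distance form of the RBF kernel are assumptions chosen for illustration, not values taken from the paper; it only shows how symbolic matching can be replaced by a kernel over symbol embeddings, with a proof scored by its weakest unification step.

```python
import numpy as np

# Illustrative sketch of differentiable (soft) unification.
# Symbol names, dimension, and bandwidth `mu` are made up for this example.

rng = np.random.default_rng(0)
dim, mu = 8, 1.0

# Dense vector representations of predicate symbols.
embeddings = {
    "grandfatherOf": rng.normal(size=dim),
    "grandpaOf": rng.normal(size=dim),
}

def unify_score(u, v, mu=mu):
    """Soft unification: 1.0 for identical vectors, decaying with distance."""
    return np.exp(-np.sum((u - v) ** 2) / (2 * mu ** 2))

def proof_score(unification_scores):
    """A proof is only as strong as its weakest unification step."""
    return min(unification_scores)

s1 = unify_score(embeddings["grandfatherOf"], embeddings["grandpaOf"])
s2 = unify_score(embeddings["grandfatherOf"], embeddings["grandfatherOf"])
print(proof_score([s1, s2]))  # bounded above by the weakest step, s1
```

Because every step is differentiable, gradient descent can pull the embeddings of symbols that are used interchangeably (here, the hypothetical "grandfatherOf" and "grandpaOf") closer together, which is what lets the network exploit such similarities when proving queries.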


