Building Memory with Concept Learning Capabilities from Large-scale Knowledge Base

12/03/2015
by Jiaxin Shi, et al.

We present a new perspective on neural knowledge base (KB) embeddings, from which we build a framework that models the symbolic knowledge in a KB together with its learning process. We show that this framework effectively regularizes previous neural KB embedding models, yielding superior performance on reasoning tasks, while also handling unseen entities, that is, learning their embeddings from natural-language descriptions, much as humans learn semantic concepts.
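The abstract gives only the high-level idea, so the following is a minimal, hypothetical sketch in PyTorch of one way such a framework could look: a TransE-style scoring function over entity embeddings that are computed from description text, so an entity unseen during training still receives an embedding directly from its description. The class and method names (ConceptLearner, encode_entity, score), the bag-of-words description encoder, and the margin-ranking objective are all illustrative assumptions, not the paper's actual architecture.

import torch
import torch.nn as nn

class ConceptLearner(nn.Module):
    """Hypothetical sketch: KB embeddings where entity vectors
    come from natural-language descriptions, not a lookup table."""

    def __init__(self, vocab_size, n_relations, dim=50):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)  # description tokens
        self.rel_emb = nn.Embedding(n_relations, dim)  # KB relations

    def encode_entity(self, desc_token_ids):
        # Embed an entity as the mean of its description's word vectors,
        # so unseen entities get embeddings from their descriptions alone.
        return self.word_emb(desc_token_ids).mean(dim=1)

    def score(self, head_desc, rel_ids, tail_desc):
        # TransE-style plausibility: -||h + r - t|| (higher = more plausible).
        h = self.encode_entity(head_desc)
        r = self.rel_emb(rel_ids)
        t = self.encode_entity(tail_desc)
        return -(h + r - t).norm(p=2, dim=-1)

# Usage: margin ranking against corrupted triples, the standard
# KB-embedding training objective (an assumption for this excerpt).
model = ConceptLearner(vocab_size=10_000, n_relations=200)
heads = torch.randint(0, 10_000, (4, 12))  # 4 descriptions, 12 tokens each
rels = torch.randint(0, 200, (4,))
tails = torch.randint(0, 10_000, (4, 12))
pos = model.score(heads, rels, tails)
neg = model.score(heads, rels, tails[torch.randperm(4)])  # corrupted tails
loss = torch.relu(1.0 + neg - pos).mean()
loss.backward()

Because entity vectors are produced by the description encoder rather than read from a fixed embedding table, an entity never seen during training still gets a usable embedding, which is the "concept learning" behavior the abstract describes.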


Related research

- 07/08/2022: Improving Entity Disambiguation by Reasoning over a Knowledge Base. "Recent work in entity disambiguation (ED) has typically neglected struct..."
- 09/21/2020: Visual-Semantic Embedding Model Informed by Structured Knowledge. "We propose a novel approach to improve a visual-semantic embedding model..."
- 12/10/2022: MAPS-KB: A Million-scale Probabilistic Simile Knowledge Base. "The ability to understand and generate similes is an imperative step to ..."
- 02/14/2020: Scalable Neural Methods for Reasoning With a Symbolic Knowledge Base. "We describe a novel way of representing a symbolic knowledge base (KB) c..."
- 08/21/2023: Deciphering Raw Data in Neuro-Symbolic Learning with Provable Guarantees. "Neuro-symbolic hybrid systems are promising for integrating machine lear..."
- 01/25/2021: A Simple Disaster-Related Knowledge Base for Intelligent Agents. "In this paper, we describe our efforts in establishing a simple knowledg..."
- 11/15/2018: Combining Axiom Injection and Knowledge Base Completion for Efficient Natural Language Inference. "In logic-based approaches to reasoning tasks such as Recognizing Textual..."
