Composing Knowledge Graph Embeddings via Word Embeddings

09/09/2019
by Lianbo Ma, et al.

Learning knowledge graph embeddings from an existing knowledge graph is essential for knowledge graph completion. For a fact (h, r, t), in which the head entity h is connected to the tail entity t by the relation r, current approaches learn low-dimensional vector representations h, r, and t, one for each element of the triple. Because these representations are learned only from facts already present in the knowledge graph, they cannot be used to detect unknown facts whose entities or relations never occur in the graph. This paper proposes a new approach, TransW, which goes beyond current work by composing knowledge graph embeddings from word embeddings. Since an entity or relation name usually consists of one or more words, it is sensible to learn a mapping function from the word embedding space to the knowledge embedding space, which captures how entities are constructed from human words. More importantly, composing knowledge embeddings from word embeddings makes it possible to handle emerging facts involving new entities or relations. Experimental results on three public datasets show that the proposed TransW is consistent and outperforms existing approaches.
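
To make the core idea concrete, here is a minimal sketch in Python of composing triple embeddings from word embeddings and scoring a fact with a TransE-style distance. The averaging composition, the linear map W, the bias b, and all names below are illustrative assumptions, not the paper's exact formulation of TransW.

```python
import numpy as np

# Hypothetical toy word-embedding table (in practice, pretrained vectors
# such as word2vec or GloVe would be used).
rng = np.random.default_rng(0)
dim = 50
vocab = ["new", "york", "city", "located", "in", "state"]
word_vec = {w: rng.normal(size=dim) for w in vocab}

# Assumed composition: map the word embeddings of an entity's (or
# relation's) name into the knowledge-embedding space via a learned
# linear map W plus bias b. In TransW these parameters would be trained
# jointly with a translation-based objective; here they are random.
W = rng.normal(size=(dim, dim)) * 0.1
b = np.zeros(dim)

def compose(name: str) -> np.ndarray:
    """Compose a knowledge-graph embedding from the words in a name."""
    words = name.lower().split()
    avg = np.mean([word_vec[w] for w in words], axis=0)
    return W @ avg + b

def score(h: str, r: str, t: str) -> float:
    """TransE-style plausibility score ||h + r - t||: lower is more plausible."""
    return float(np.linalg.norm(compose(h) + compose(r) - compose(t)))

# Because embeddings are built from words, even an entity unseen during
# training (e.g. "new york state") can still be composed and scored.
print(score("new york city", "located in", "new york state"))
```

The key design point the sketch illustrates is that nothing in compose() depends on the entity having appeared in the training graph; any name whose words have embeddings can be mapped into the knowledge-embedding space.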


