Fast Linear Model for Knowledge Graph Embeddings

10/30/2017
by Armand Joulin et al.

This paper shows that a simple baseline based on a Bag-of-Words (BoW) representation learns surprisingly good knowledge graph embeddings. By casting knowledge base completion and question answering as supervised classification problems, we observe that modeling co-occurrences of entities and relations leads to state-of-the-art performance with a training time of a few minutes using the open-source library fastText.
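The core idea is that a triple (head, relation, tail) can be rewritten as a supervised classification example: the tail entity becomes the label, and the head entity plus the relation become the input "bag of words". Below is a minimal sketch of that casting in fastText's supervised input format; the field order and label convention are illustrative assumptions, not the authors' exact preprocessing.

```python
# Sketch: cast knowledge base completion as supervised classification
# by writing each triple as a fastText-style training line.
# The label convention (__label__<tail>) follows fastText's default prefix;
# the specific triples here are made-up examples.

def triples_to_fasttext_lines(triples):
    """Turn (head, relation, tail) triples into fastText training lines.

    Each line asks the classifier to predict the tail entity (the label)
    from a bag of words containing the head entity and the relation.
    """
    return [f"__label__{tail} {head} {relation}"
            for head, relation, tail in triples]

triples = [
    ("paris", "capital_of", "france"),
    ("berlin", "capital_of", "germany"),
]
for line in triples_to_fasttext_lines(triples):
    print(line)  # e.g. "__label__france paris capital_of"
```

A file of such lines can then be trained with the standard fastText CLI, e.g. `fasttext supervised -input train.txt -output model`, which is where the few-minute training times come from.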


