A Factorization Machine Framework for Testing Bigram Embeddings in Knowledgebase Completion

04/20/2016
by Johannes Welbl, et al.

Embedding-based Knowledge Base Completion models have so far mostly combined distributed representations of individual entities or relations to compute truth scores of missing links. Facts can, however, also be represented using pairwise embeddings, i.e. embeddings of pairs of entities and relations. In this paper we explore such bigram embeddings with a flexible Factorization Machine model and several ablations of it. We investigate the relevance of various bigram types on the FB15k-237 dataset and find relative improvements compared to a compositional model.
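To make the pairwise-embedding idea concrete, the following is a minimal sketch (not the authors' implementation) of a Factorization Machine scorer for a triple (s, r, o). It combines unigram embeddings for the subject, relation, and object with bigram embeddings for the (subject, relation) and (relation, object) pairs, and scores the triple as the sum of dot products over all pairs of active feature vectors. The embedding tables, dimensions, and lazy bigram initialization are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

rng = np.random.default_rng(0)
n_entities, n_relations, dim = 1000, 200, 50

# Unigram embeddings for individual entities and relations (illustrative sizes).
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

# Bigram embeddings for (subject, relation) and (relation, object) pairs,
# created lazily the first time a pair is seen.
SR = {}
RO = {}

def bigram(table, key):
    """Fetch (or initialize) the embedding of a pair feature."""
    if key not in table:
        table[key] = rng.normal(scale=0.1, size=dim)
    return table[key]

def fm_score(s, r, o):
    """Factorization Machine truth score of the triple (s, r, o).

    Active features: the unigrams s, r, o plus the bigrams (s, r) and (r, o).
    The FM interaction term is the sum of dot products over all pairs of
    active feature embeddings.
    """
    feats = [E[s], R[r], E[o], bigram(SR, (s, r)), bigram(RO, (r, o))]
    score = 0.0
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            score += feats[i] @ feats[j]
    return score

print(fm_score(3, 7, 42))
```

Ablating individual bigram types, as the paper does, would correspond here to dropping the (s, r) or (r, o) entries from the feature list before scoring.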


Related research

10/24/2021  Scalable knowledge base completion with superposition memories
We present Harmonic Memory Networks (HMem), a neural architecture for kn...

11/02/2018  Augmenting Compositional Models for Knowledge Base Completion Using Gradient Representations
Neural models of Knowledge Base data have typically employed composition...

04/29/2010  Isometric Embeddings in Imaging and Vision: Facts and Fiction
We explore the practicability of Nash's Embedding Theorem in vision and ...

05/24/2018  Interpretable and Compositional Relation Learning by Joint Training with an Autoencoder
Embedding models for entities and relations are extremely useful for rec...

08/25/2018  Data-dependent Learning of Symmetric/Antisymmetric Relations for Knowledge Base Completion
Embedding-based methods for knowledge base completion (KBC) learn repres...

07/13/2020  BoxE: A Box Embedding Model for Knowledge Base Completion
Knowledge base completion (KBC) aims to automatically infer missing fact...

05/10/2015  Probabilistic Belief Embedding for Knowledge Base Completion
This paper contributes a novel embedding model which measures the probab...
