Integrating Knowledge Graph embedding and pretrained Language Models in Hypercomplex Spaces

08/04/2022
by Mojtaba Nayyeri, et al.

Knowledge Graphs, such as Wikidata, represent knowledge through both structural and textual modalities. For each modality, dedicated approaches, graph embedding models and language models respectively, learn patterns that allow predicting novel structural knowledge. Few approaches integrate learning and inference across both modalities, and the existing ones exploit the interaction of structural and textual knowledge only partially. In our approach, we build on existing strong representations of the single modalities and use hypercomplex algebra to represent (i) single-modality embeddings as well as (ii) the interaction between different modalities and their complementary means of knowledge representation. More specifically, we suggest Dihedron and Quaternion representations of 4D hypercomplex numbers to integrate four modalities: structural knowledge graph embeddings, word-level representations (e.g. Word2vec, FastText), sentence-level representations (Sentence Transformers), and document-level representations (Sentence Transformers, Doc2vec). Our unified vector representation scores the plausibility of labelled edges via the Hamilton and Dihedron products, thus modeling pairwise interactions between the different modalities. Extensive experimental evaluation on standard benchmark datasets shows the superiority of our two new models, which use abundant textual information besides sparse structural knowledge to enhance performance in link prediction tasks.
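To make the quaternion scoring idea concrete, here is a minimal sketch of Hamilton-product-based triple scoring in the style of quaternion knowledge graph embeddings (QuatE-like): the head embedding is rotated by a normalised relation quaternion and compared to the tail via an inner product. The function names, dimensions, and normalisation details are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def hamilton_product(q, p):
    """Hamilton product of two quaternion embeddings, each given as
    the four components (a, b, c, d), stacked into shape (4, dim)."""
    a1, b1, c1, d1 = q
    a2, b2, c2, d2 = p
    return np.stack([
        a1*a2 - b1*b2 - c1*c2 - d1*d2,   # real part
        a1*b2 + b1*a2 + c1*d2 - d1*c2,   # i component
        a1*c2 - b1*d2 + c1*a2 + d1*b2,   # j component
        a1*d2 + b1*c2 - c1*b2 + d1*a2,   # k component
    ])

def score(head, rel, tail):
    """Plausibility of a triple (head, rel, tail): rotate the head by
    the component-wise normalised relation quaternion, then take the
    inner product with the tail embedding (illustrative scoring)."""
    norm = np.sqrt((rel ** 2).sum(axis=0, keepdims=True))
    rotated = hamilton_product(head, rel / norm)
    return float((rotated * tail).sum())

# Toy demo with random embeddings of quaternion dimension 8.
dim = 8
rng = np.random.default_rng(0)
h, r, t = (rng.standard_normal((4, dim)) for _ in range(3))
print(score(h, r, t))
```

In the paper's multimodal setting, the four quaternion components would instead carry the structural, word-level, sentence-level, and document-level embeddings, so the Hamilton product mixes every pair of modalities in a single operation.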


Related research

- Knowledge Graph Representation with Jointly Structural and Textual Encoding (11/26/2016): The objective of knowledge graph embedding is to encode both entities an...
- Edge: Enriching Knowledge Graph Embeddings with External Text (04/11/2021): Knowledge graphs suffer from sparsity which degrades the quality of repr...
- Language Models as Knowledge Embeddings (06/25/2022): Knowledge embeddings (KE) represent a knowledge graph (KG) by embedding ...
- Explicit Pairwise Word Interaction Modeling Improves Pretrained Transformers for English Semantic Similarity Tasks (11/07/2019): In English semantic similarity tasks, classic word embedding-based appro...
- KERMIT – A Transformer-Based Approach for Knowledge Graph Matching (04/29/2022): One of the strongest signals for automated matching of knowledge graphs ...
- The DLCC Node Classification Benchmark for Analyzing Knowledge Graph Embeddings (07/13/2022): Knowledge graph embedding is a representation learning technique that pr...
- Higher-order Comparisons of Sentence Encoder Representations (09/01/2019): Representational Similarity Analysis (RSA) is a technique developed by n...
