Knowledge Graph Completion with Pre-trained Multimodal Transformer and Twins Negative Sampling

09/15/2022
by   Yichi Zhang, et al.

Knowledge graphs (KGs), which model world knowledge as structural triples, are inevitably incomplete, and the same problem exists for multimodal knowledge graphs (MMKGs). Knowledge graph completion (KGC) is therefore of great importance for predicting the missing triples in existing KGs. Among existing KGC methods, embedding-based approaches rely on manual design to leverage multimodal information, while finetune-based approaches are not superior to embedding-based methods in link prediction. To address these problems, we propose a VisualBERT-enhanced Knowledge Graph Completion model (VBKGC for short). VBKGC captures deeply fused multimodal information for entities and integrates it into the KGC model. Besides, we achieve the co-design of the KGC model and negative sampling by introducing a new strategy called twins negative sampling, which is suitable for multimodal scenarios and can align the different embeddings of an entity. We conduct extensive experiments to show the outstanding performance of VBKGC on the link prediction task and make further explorations of VBKGC.
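To make the idea concrete, here is a minimal, hypothetical sketch of twins negative sampling for a multimodal KGC model. The class and variable names, the TransE-style distance scoring, and the use of a second embedding table as a stand-in for VisualBERT-derived visual features are all assumptions for illustration; the paper's actual model and objective may differ.

```python
import torch
import torch.nn as nn

class TwinsNegSampleKGC(nn.Module):
    """Hypothetical sketch: one structural and one 'visual' embedding per entity.

    The visual table stands in for features a pre-trained multimodal
    transformer (e.g. VisualBERT) would produce; this is an assumption,
    not the paper's actual fusion mechanism.
    """

    def __init__(self, n_entities, n_relations, dim=64):
        super().__init__()
        self.ent_struct = nn.Embedding(n_entities, dim)  # structural embeddings
        self.ent_visual = nn.Embedding(n_entities, dim)  # stand-in for visual features
        self.rel = nn.Embedding(n_relations, dim)
        self.n_entities = n_entities

    def score(self, emb, h, r, t):
        # TransE-style score: smaller distance => more plausible triple
        return (emb(h) + self.rel(r) - emb(t)).norm(p=1, dim=-1)

    def loss(self, h, r, t, margin=1.0):
        # Twins negative sampling (as sketched here): corrupt the tail ONCE
        # and reuse the SAME negative entity for both the structural and the
        # visual score, so the two embedding spaces of an entity are trained
        # against identical negatives and stay aligned.
        t_neg = torch.randint(0, self.n_entities, t.shape)
        total = 0.0
        for emb in (self.ent_struct, self.ent_visual):
            pos = self.score(emb, h, r, t)
            neg = self.score(emb, h, r, t_neg)
            total = total + torch.relu(margin + pos - neg).mean()
        return total
```

A training step would then sample a batch of positive triples, call `loss`, and backpropagate; drawing independent negatives per embedding type would break the alignment property this strategy aims for.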


Related research

08/22/2017
Analysis of the Impact of Negative Sampling on Link Prediction in Knowledge Graphs
Knowledge graphs are large, useful, but incomplete knowledge repositorie...

07/29/2022
KG-NSF: Knowledge Graph Completion with a Negative-Sample-Free Approach
Knowledge Graph (KG) completion is an important task that greatly benefi...

02/25/2022
CAKE: A Scalable Commonsense-Aware Framework For Multi-View Knowledge Graph Completion
Knowledge graphs store a large number of factual triples while they are ...

05/04/2022
Hybrid Transformer with Multi-level Fusion for Multimodal Knowledge Graph Completion
Multimodal Knowledge Graphs (MKGs), which organize visual-text factual k...

05/25/2023
How to Turn Your Knowledge Graph Embeddings into Generative Models via Probabilistic Circuits
Some of the most successful knowledge graph embedding (KGE) models for l...

10/01/2022
Multimodal Analogical Reasoning over Knowledge Graphs
Analogical reasoning is fundamental to human cognition and holds an impo...

03/09/2022
Language Model-driven Negative Sampling
Knowledge Graph Embeddings (KGEs) encode the entities and relations of a...
