Improving Knowledge Graph Representation Learning by Structure Contextual Pre-training

12/08/2021
by Ganqiang Ye, et al.

Representation learning models for knowledge graphs (KGs) have proven effective at encoding structural information and performing reasoning over KGs. In this paper, we propose a novel pre-training-then-fine-tuning framework for knowledge graph representation learning, in which a KG model is first pre-trained with a triple classification task and then discriminatively fine-tuned on specific downstream tasks such as entity type prediction and entity alignment. Drawing on the general idea of learning deep contextualized word representations in pre-trained language models, we propose SCoP, which learns pre-trained KG representations by encoding the target triple together with its structural and contextual triples. Experimental results demonstrate that fine-tuning SCoP not only outperforms baselines on a portfolio of downstream tasks but also avoids tedious task-specific model design and parameter training.
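The abstract describes a two-stage protocol: pre-train an encoder over a target triple and its contextual triples with a triple classification objective, then fine-tune the same encoder with a lightweight task-specific head. Below is a minimal PyTorch sketch of that flow. All names, dimensions, and the choice of a Transformer encoder here are illustrative assumptions, not the authors' SCoP implementation.

```python
import torch
import torch.nn as nn

class ContextualTripleEncoder(nn.Module):
    """Hypothetical encoder: a target triple plus its contextual triples."""

    def __init__(self, vocab_size, dim=256, num_layers=4, num_heads=4):
        super().__init__()
        # One shared id space for entities and relations (an assumption).
        self.embed = nn.Embedding(vocab_size, dim)
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=num_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) -- the target triple's (h, r, t) ids
        # followed by the flattened ids of its contextual triples.
        hidden = self.encoder(self.embed(token_ids))
        return hidden[:, 0]  # pooled representation of the target triple

VOCAB, DIM = 10_000, 256
encoder = ContextualTripleEncoder(VOCAB, DIM)

# --- Pre-training: binary triple classification (true vs. corrupted) ---
cls_head = nn.Linear(DIM, 2)
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(cls_head.parameters()), lr=1e-4)

batch = torch.randint(0, VOCAB, (32, 9))  # toy batch: target + 2 context triples
labels = torch.randint(0, 2, (32,))       # 1 = observed triple, 0 = corrupted
loss = nn.functional.cross_entropy(cls_head(encoder(batch)), labels)
optimizer.zero_grad()
loss.backward()
optimizer.step()

# --- Fine-tuning: reuse the pre-trained encoder, swap in a task head ---
# e.g. entity type prediction over 50 hypothetical entity types.
type_head = nn.Linear(DIM, 50)
type_logits = type_head(encoder(batch))
```

The point of the sketch is the reuse: every downstream task shares the pre-trained encoder and differs only in its head, which is what lets such a framework avoid per-task model design and from-scratch parameter training, as the abstract claims.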

Related research

02/24/2023 · SGL-PT: A Strong Graph Learner with Graph Prompt Tuning
Recently, much exertion has been paid to design graph self-supervised me...

09/14/2023 · A Data Source for Reasoning Embodied Agents
Recent progress in using machine learning models for reasoning tasks has...

04/28/2022 · Process-BERT: A Framework for Representation Learning on Educational Process Data
Educational process data, i.e., logs of detailed student activities in c...

02/28/2019 · Efficient Contextual Representation Learning Without Softmax Layer
Contextual representation models have achieved great success in improvin...

08/22/2022 · Repurposing Knowledge Graph Embeddings for Triple Representation via Weak Supervision
The majority of knowledge graph embedding techniques treat entities and ...

06/17/2019 · Exploiting Unsupervised Pre-training and Automated Feature Engineering for Low-resource Hate Speech Detection in Polish
This paper presents our contribution to PolEval 2019 Task 6: Hate speech...

06/26/2020 · TURL: Table Understanding through Representation Learning
Relational tables on the Web store a vast amount of knowledge. Owing to ...
