Bridging the KB-Text Gap: Leveraging Structured Knowledge-aware Pre-training for KBQA

08/28/2023
by   Guanting Dong, et al.

Knowledge Base Question Answering (KBQA) aims to answer natural language questions with factual information such as entities and relations in KBs. However, traditional Pre-trained Language Models (PLMs) are pre-trained directly on large-scale natural language corpora, which makes it difficult for them to understand and represent the complex subgraphs found in structured KBs. To bridge the gap between texts and structured KBs, we propose a Structured Knowledge-aware Pre-training method (SKP). In the pre-training stage, we introduce two novel structured knowledge-aware tasks that guide the model to learn implicit relationships and better representations of complex subgraphs. In the downstream KBQA task, we further design an efficient linearization strategy and an interval attention mechanism, which respectively help the model encode complex subgraphs and shield it from the interference of irrelevant subgraphs during reasoning. Detailed experiments and analyses on WebQSP verify the effectiveness of SKP, especially the significant improvement in subgraph retrieval (+4.08%).
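The abstract only names the linearization strategy and the interval attention mechanism; it does not spell out how either works. The sketch below is a minimal, illustrative reading, not the paper's actual implementation: KB triples are flattened into a text sequence, and a block-diagonal attention mask restricts each token to tokens from the same triple, so unrelated triples cannot interfere with one another during encoding. The helper names (linearize_subgraph, interval_attention_mask) and the exact masking rule are assumptions for illustration.

from typing import List, Tuple

import numpy as np

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)


def linearize_subgraph(triples: List[Triple], sep: str = " [SEP] ") -> str:
    """Flatten KB triples into one text sequence for a PLM encoder.

    Each triple is rendered as "head relation tail"; triples are joined
    with a separator token so the encoder can tell them apart.
    (Illustrative; the paper's actual linearization may differ.)
    """
    return sep.join(f"{h} {r} {t}" for h, r, t in triples)


def interval_attention_mask(triple_lengths: List[int]) -> np.ndarray:
    """Block-diagonal attention mask over the linearized sequence.

    Tokens may only attend to tokens belonging to the same triple,
    which keeps irrelevant triples from interfering with each other.
    Returns a (seq_len, seq_len) matrix of 0/1 flags (1 = may attend).
    """
    seq_len = sum(triple_lengths)
    mask = np.zeros((seq_len, seq_len), dtype=np.int64)
    start = 0
    for length in triple_lengths:
        mask[start:start + length, start:start + length] = 1
        start += length
    return mask


if __name__ == "__main__":
    subgraph = [
        ("Barack Obama", "born_in", "Honolulu"),
        ("Honolulu", "located_in", "Hawaii"),
    ]
    print(linearize_subgraph(subgraph))
    # Pretend each triple tokenizes to 3 tokens: the mask is two 3x3 blocks.
    print(interval_attention_mask([3, 3]))

In practice such a mask would be passed to a Transformer encoder as an additive attention bias (0 for allowed pairs, a large negative value for blocked pairs); question tokens would typically keep full attention so reasoning can still span triples.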


