Calculating Question Similarity is Enough: A New Method for KBQA Tasks

11/15/2021
by Hanyu Zhao, et al.

Knowledge Base Question Answering (KBQA) aims to answer natural language questions with the help of an external knowledge base. The core idea is to find the link between the internal knowledge behind a question and the known triples of the knowledge base. The typical KBQA pipeline contains several steps, including entity recognition, relation extraction, and entity linking, which means that an error in any step inevitably propagates to the final prediction. To address this problem, this paper proposes a Corpus Generation - Retrieve Method (CGRM) that combines a Pre-trained Language Model (PLM) with a Knowledge Graph (KG). First, based on the mT5 model, we design two new pre-training tasks, knowledge masked language modeling and paragraph-based question generation, to obtain a knowledge-enhanced T5 (kT5) model. Second, after preprocessing the knowledge graph triples with a series of heuristic rules, the kT5 model generates natural language QA pairs from the processed triples. Finally, we answer questions directly by retrieving over this synthetic dataset. We evaluate our method on the NLPCC-ICCPOL 2016 KBQA dataset; the results show that the framework improves KBQA performance and that this straightforward method is competitive with the state of the art.
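Below is a minimal, illustrative Python sketch of the retrieval stage described in the abstract: given a synthetic QA dataset (as a kT5-style generator might produce from verbalized triples), an incoming question is answered by returning the answer attached to its most similar synthetic question. The verbalization rule, the example triples, and the TF-IDF cosine similarity are assumptions made only to keep the sketch self-contained; the paper's actual generator and similarity model may differ.

# Minimal sketch (not the paper's implementation): answer a question by
# retrieving the most similar question from a synthetic QA dataset built
# from knowledge-graph triples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def verbalize(subject, relation, obj):
    # One simple heuristic rule turning a triple into a QA pair; the paper
    # instead uses heuristic preprocessing plus a kT5 generator.
    return (f"What is the {relation} of {subject}?", obj)

# Illustrative triples, not taken from the paper's knowledge base.
triples = [
    ("Pride and Prejudice", "author", "Jane Austen"),
    ("France", "capital", "Paris"),
    ("Eiffel Tower", "completion year", "1889"),
]
synthetic_qa = [verbalize(*t) for t in triples]

questions = [q for q, _ in synthetic_qa]
vectorizer = TfidfVectorizer()
question_matrix = vectorizer.fit_transform(questions)

def answer(user_question):
    # Retrieve the synthetic question most similar to the input question
    # and return its attached answer.
    query_vec = vectorizer.transform([user_question])
    scores = cosine_similarity(query_vec, question_matrix)[0]
    return synthetic_qa[scores.argmax()][1]

print(answer("Who is the author of Pride and Prejudice?"))  # Jane Austen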

Related research

11/10/2019 · Knowledge Guided Text Retrieval and Reading for Open Domain Question Answering
This paper presents a general approach for open-domain question answering...

03/12/2018 · Entity-Aware Language Model as an Unsupervised Reranker
In language modeling, it is difficult to incorporate entity relationship...

09/12/2021 · Knowledge Enhanced Fine-Tuning for Better Handling Unseen Entities in Dialogue Generation
Although pre-training models have achieved great success in dialogue generation...

09/07/2022 · Knowledge-enhanced Iterative Instruction Generation and Reasoning for Knowledge Base Question Answering
Multi-hop Knowledge Base Question Answering (KBQA) aims to find the answer...

06/18/2021 · SPBERT: An Efficient Pre-training BERT on SPARQL Queries for Question Answering over Knowledge Graphs
In this paper, we propose SPBERT, a transformer-based language model pre...

02/08/2020 · HHH: An Online Medical Chatbot System based on Knowledge Graph and Hierarchical Bi-Directional Attention
This paper proposes a chatbot framework that adopts a hybrid model which...

05/23/2023 · Mitigating Language Model Hallucination with Interactive Question-Knowledge Alignment
Despite the remarkable recent advances in language models, they still st...
