SPBERT: An Efficient Pre-training BERT on SPARQL Queries for Question Answering over Knowledge Graphs

06/18/2021
by Hieu Tran, et al.

In this paper, we propose SPBERT, a transformer-based language model pre-trained on massive SPARQL query logs. By incorporating a masked language modeling objective and a word structural objective, SPBERT learns general-purpose representations of both natural language and the SPARQL query language. We investigate how SPBERT and an encoder-decoder architecture can be adapted to knowledge-based question answering corpora, and we conduct extensive experiments on two downstream tasks: SPARQL Query Construction and Answer Verbalization Generation. The experimental results show that SPBERT achieves promising performance, including state-of-the-art BLEU scores on several of these tasks.
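
The abstract pairs SPBERT with an encoder-decoder setup for tasks such as SPARQL Query Construction. As a rough illustration of that idea only, the sketch below wires a natural-language encoder and a SPARQL-side decoder together with Hugging Face's EncoderDecoderModel. The checkpoint names, the single shared tokenizer, and the example question are placeholders rather than the authors' released configuration, and the model would still need fine-tuning on question-SPARQL pairs before its output is meaningful.

    # Minimal sketch; checkpoint names and tokenizer choice are assumptions,
    # not the authors' released setup.
    from transformers import BertTokenizerFast, EncoderDecoderModel

    NL_ENCODER = "bert-base-uncased"              # encodes the natural-language question
    SPARQL_DECODER = "path/to/spbert-checkpoint"  # hypothetical BERT weights pre-trained on SPARQL logs

    tokenizer = BertTokenizerFast.from_pretrained(NL_ENCODER)
    model = EncoderDecoderModel.from_encoder_decoder_pretrained(NL_ENCODER, SPARQL_DECODER)

    # Seq2seq generation needs explicit start/pad token ids on the config.
    model.config.decoder_start_token_id = tokenizer.cls_token_id
    model.config.pad_token_id = tokenizer.pad_token_id

    # After fine-tuning on (question, SPARQL) pairs, decode a query for a new question.
    question = "Who developed SPARQL?"
    batch = tokenizer(question, return_tensors="pt")
    ids = model.generate(batch.input_ids, attention_mask=batch.attention_mask, max_length=64)
    print(tokenizer.decode(ids[0], skip_special_tokens=True))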

Related research

Utilizing Bidirectional Encoder Representations from Transformers for Answer Selection (11/14/2020)
Pre-training a transformer-based model for the language modeling task in...

Unified Language Model Pre-training for Natural Language Understanding and Generation (05/08/2019)
This paper presents a new Unified pre-trained Language Model (UniLM) ...

StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding (08/13/2019)
Recently, the pre-trained language model BERT (Devlin et al., 2018) ...

Calculating Question Similarity is Enough: A New Method for KBQA Tasks (11/15/2021)
Knowledge Base Question Answering (KBQA) aims to answer natural language...

Incorporating Behavioral Hypotheses for Query Generation (10/06/2020)
Generative neural networks have been shown effective on query suggestion...

Forging Multiple Training Objectives for Pre-trained Language Models via Meta-Learning (10/19/2022)
Multiple pre-training objectives fill the vacancy of the understanding ...

VECO: Variable Encoder-decoder Pre-training for Cross-lingual Understanding and Generation (10/30/2020)
Recent studies about learning multilingual representations have achieved...