A Comprehensive Evaluation of the Copy Mechanism for Natural Language to SPARQL Query Generation

04/16/2023
by Samuel Reyd, et al.

In recent years, the field of neural machine translation (NMT) for SPARQL query generation has witnessed significant growth. Recently, the incorporation of a copy mechanism into traditional encoder-decoder architectures and the use of pre-trained encoder-decoders have set new performance benchmarks. This paper presents a large variety of experiments that replicate and expand upon recent NMT-based SPARQL generation studies, comparing pre-trained and non-pre-trained models, question annotation formats, and the use of a copy mechanism for both non-pre-trained and pre-trained models. Our results show that adding either the copy mechanism or a question annotation improves performance for non-pre-trained as well as pre-trained models, setting new baselines for three popular datasets.
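To illustrate the technique the abstract centers on: a copy mechanism (in the pointer-generator style) blends the decoder's vocabulary distribution with a distribution over source tokens derived from attention weights, letting the model emit knowledge-base identifiers it has never seen in its output vocabulary. The sketch below is illustrative only — the function name, token values, and probabilities are assumptions for demonstration, not details from the paper.

```python
def copy_distribution(p_vocab, attention, source_tokens, vocab, p_gen):
    """Blend the decoder's vocabulary distribution with a copy
    distribution over the source tokens (pointer-generator style).

    p_vocab       -- dict token -> probability from the decoder softmax
    attention     -- attention weights, one per source token (sum to 1)
    source_tokens -- the input question tokens
    vocab         -- output vocabulary tokens
    p_gen         -- probability of generating from the vocabulary
                     rather than copying from the source
    """
    # Generation part: scale the vocabulary distribution by p_gen.
    final = {tok: p_gen * p_vocab.get(tok, 0.0) for tok in vocab}
    # Copy part: out-of-vocabulary source tokens (e.g. a KB entity
    # label) receive probability mass proportional to their attention.
    for tok, a in zip(source_tokens, attention):
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * a
    return final


# Hypothetical example: the entity "dbr:Paris" is absent from the
# decoder vocabulary but present in the annotated question, so copying
# makes it reachable in the output.
vocab = ["SELECT", "WHERE", "{", "}", "?x"]
p_vocab = {"SELECT": 0.5, "WHERE": 0.2, "{": 0.1, "}": 0.1, "?x": 0.1}
source_tokens = ["what", "is", "dbr:Paris"]
attention = [0.1, 0.1, 0.8]

dist = copy_distribution(p_vocab, attention, source_tokens, vocab, p_gen=0.6)
```

Because both input distributions sum to one, the blended distribution also sums to one, and the out-of-vocabulary token `dbr:Paris` ends up with probability `(1 - p_gen) * 0.8 = 0.32` — more than any single generated keyword here.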


