Syntactic Knowledge via Graph Attention with BERT in Machine Translation

05/22/2023
by Yuqian Dai et al.

Although the Transformer can effectively capture contextual features through its self-attention mechanism, it still fails to model deeper syntactic knowledge. To alleviate this problem, we propose Syntactic knowledge via Graph attention with BERT (SGB) for Machine Translation (MT). A Graph Attention Network (GAT) and BERT jointly represent syntactic dependency features as explicit knowledge of the source language, enriching source-language representations and guiding target-language generation. Our experiments use sentences with gold syntax annotations and a Quality Estimation (QE) model to interpret how syntactic knowledge improves translation quality, rather than relying on BLEU scores alone. Experiments show that the proposed SGB engines improve translation quality across three MT tasks without sacrificing BLEU scores. We investigate which source sentence lengths benefit most and which dependency relations the SGB engines identify better. We also find that GAT's learning of specific dependency relations is reflected in the translation quality of sentences containing those relations, and that placing syntax on the graph leads the middle and bottom layers of BERT to model syntactic aspects of source sentences in new ways.
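To make the mechanism concrete, below is a minimal sketch, in PyTorch, of the core idea: a single graph-attention layer that propagates information along the dependency edges of a source sentence over BERT-style token embeddings. This is not the authors' released implementation; the class name, the toy dependency heads, and the use of random vectors in place of actual BERT outputs are all illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of the SGB idea: one GAT layer
# restricted to dependency edges, applied to BERT-style token embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleGATLayer(nn.Module):
    """One graph-attention layer that attends only along dependency edges."""
    def __init__(self, dim):
        super().__init__()
        self.W = nn.Linear(dim, dim, bias=False)    # shared linear projection
        self.a = nn.Linear(2 * dim, 1, bias=False)  # attention scoring vector

    def forward(self, h, adj):
        # h:   (seq_len, dim) token representations (e.g., BERT outputs)
        # adj: (seq_len, seq_len) 1 where a dependency edge or self-loop exists
        n = h.size(0)
        Wh = self.W(h)                                   # (n, dim)
        # Pairwise concatenation [Wh_i ; Wh_j] for raw attention scores
        pairs = torch.cat(
            [Wh.unsqueeze(1).expand(n, n, -1),
             Wh.unsqueeze(0).expand(n, n, -1)], dim=-1)  # (n, n, 2*dim)
        e = F.leaky_relu(self.a(pairs).squeeze(-1))      # (n, n)
        e = e.masked_fill(adj == 0, float('-inf'))       # attend only along edges
        alpha = torch.softmax(e, dim=-1)                 # normalize over neighbors
        return F.elu(alpha @ Wh)                         # syntax-aware features

# Toy usage: 5 tokens; 'heads' gives each token's dependency head index
# from a parser (-1 marks the root). Self-loops keep each token's own state.
dim, heads = 8, [1, -1, 1, 4, 1]
h = torch.randn(len(heads), dim)      # stand-in for real BERT embeddings
adj = torch.eye(len(heads))
for i, j in enumerate(heads):
    if j >= 0:
        adj[i, j] = adj[j, i] = 1.0   # undirected dependency edges

out = SimpleGATLayer(dim)(h, adj)     # syntax-enriched token representations
print(out.shape)                      # torch.Size([5, 8])
```

In the full SGB setup the graph-enriched representations would be fused back into the MT encoder to guide target generation; the sketch stops at producing the syntax-aware token states.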

