RGAT: A Deeper Look into Syntactic Dependency Information for Coreference Resolution

09/10/2023
by Yuan Meng, et al.

Although syntactic information is beneficial for many NLP tasks, combining it with contextual information between words to solve the coreference resolution problem needs to be further explored. In this paper, we propose an end-to-end parser that combines pre-trained BERT with a Syntactic Relation Graph Attention Network (RGAT) to take a deeper look into the role of syntactic dependency information in the coreference resolution task. In particular, the RGAT model is first proposed, then used to understand the syntactic dependency graph and learn better task-specific syntactic embeddings. An integrated architecture incorporating BERT embeddings and syntactic embeddings is constructed to generate blended representations for the downstream task. Our experiments on the public Gendered Ambiguous Pronouns (GAP) dataset show that with supervised learning of the syntactic dependency graph and without fine-tuning the entire BERT, we increase the F1-score of the previous best model (RGCN-with-BERT) from 80.3% to 82.5%, compared with the improvement of single BERT embeddings from 78.5% to 82.5%. Experimental results on another public dataset, OntoNotes 5.0, demonstrate that the performance of the model is also improved by incorporating syntactic dependency information learned from RGAT.
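To make the described architecture concrete, below is a minimal PyTorch sketch of the general idea: a relation-aware graph attention layer runs over the syntactic dependency graph, and its output is concatenated with frozen BERT embeddings to form a blended token representation. This is an illustrative assumption of how such a model could be wired, not the authors' released implementation; the class names, layer sizes, and the concatenation-based blending are our own choices for the sketch.

# Minimal sketch (not the authors' code) of an RGAT-over-dependencies layer
# whose output is blended with frozen BERT embeddings. All names and
# dimensions below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class RelationalGraphAttention(nn.Module):
    # One relation-aware GAT layer: each edge's attention score depends on
    # the dependency label of the edge as well as its endpoint features.
    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.proj = nn.Linear(dim, dim, bias=False)
        self.rel_emb = nn.Embedding(num_relations, dim)  # one vector per dependency label
        self.attn = nn.Linear(3 * dim, 1)

    def forward(self, h, edge_index, edge_type):
        # h: (n, dim) node features; edge_index: (2, e) head/dependent pairs;
        # edge_type: (e,) dependency-label ids.
        src, dst = edge_index
        z = self.proj(h)
        rel = self.rel_emb(edge_type)
        # Unnormalized attention logit for each edge.
        logits = F.leaky_relu(self.attn(torch.cat([z[src], z[dst], rel], dim=-1))).squeeze(-1)
        # Softmax over the incoming edges of each destination node.
        alpha = torch.zeros_like(logits)
        for node in dst.unique():
            mask = dst == node
            alpha[mask] = F.softmax(logits[mask], dim=0)
        # Aggregate relation-modulated messages into the destination nodes.
        out = torch.zeros_like(z)
        out.index_add_(0, dst, alpha.unsqueeze(-1) * (z[src] + rel))
        return F.elu(out)

class BlendedEncoder(nn.Module):
    # Concatenates frozen BERT token embeddings with the syntactic embeddings
    # produced by the RGAT layer, yielding a blended representation.
    def __init__(self, bert_dim: int = 768, syn_dim: int = 128, num_relations: int = 50):
        super().__init__()
        self.syn_in = nn.Linear(bert_dim, syn_dim)
        self.rgat = RelationalGraphAttention(syn_dim, num_relations)
        self.blend = nn.Linear(bert_dim + syn_dim, bert_dim)

    def forward(self, bert_emb, edge_index, edge_type):
        syn = self.rgat(self.syn_in(bert_emb), edge_index, edge_type)
        return self.blend(torch.cat([bert_emb, syn], dim=-1))

# Toy usage: 5 tokens and 4 dependency edges with made-up label ids.
bert_emb = torch.randn(5, 768)  # stand-in for the output of a frozen BERT
edge_index = torch.tensor([[1, 1, 3, 3], [0, 2, 1, 4]])
edge_type = torch.tensor([0, 1, 2, 3])
blended = BlendedEncoder()(bert_emb, edge_index, edge_type)
print(blended.shape)  # torch.Size([5, 768])

Keeping BERT frozen and learning only the syntactic branch and the blending layer is consistent with the abstract's claim of improving F1 "without fine-tuning the entire BERT".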

Related research

05/21/2019
Look Again at the Syntax: Relational Graph Convolutional Network for Gendered Ambiguous Pronoun Resolution
Gender bias has been found in existing coreference resolvers. In order t...

05/22/2023
GATology for Linguistics: What Syntactic Dependencies It Knows
Graph Attention Network (GAT) is a graph neural network which is one of ...

12/09/2020
Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies
Cross-lingual word sense disambiguation (WSD) tackles the challenge of d...

10/18/2021
BERMo: What can BERT learn from ELMo?
We propose BERMo, an architectural modification to BERT, which makes pre...

05/22/2023
Syntactic Knowledge via Graph Attention with BERT in Machine Translation
Although the Transformer model can effectively acquire context features ...

04/30/2020
Universal Dependencies according to BERT: both more specific and more general
This work focuses on analyzing the form and extent of syntactic abstract...

12/04/2019
AMUSED: A Multi-Stream Vector Representation Method for Use in Natural Dialogue
The problem of building a coherent and non-monotonous conversational age...
