Graph-to-Graph Transformer for Transition-based Dependency Parsing

11/08/2019
by Alireza Mohammadshahi, et al.

Transition-based dependency parsing is a challenging task for conditioning on and predicting structures. We demonstrate state-of-the-art results on this task with the Graph2Graph Transformer architecture. This novel architecture supports both the input and output of arbitrary graphs via its attention mechanism. It can also be integrated with previous neural network structured-prediction techniques and with existing pre-trained Transformer models. Both with and without BERT pre-training, adding dependency graph inputs via the attention mechanism yields significant improvements over previously proposed mechanisms for encoding the partial parse tree, improving over the state of the art in transition-based dependency parsing and achieving 95.64% accuracy on WSJ dependencies. Graph2Graph Transformers are not restricted to tree structures and can be easily applied to a wide range of NLP tasks.
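The abstract's key idea is feeding graph structure into a Transformer through the attention mechanism itself. The following is a minimal sketch (not the authors' code) of one way to do that: dependency relations between token pairs are embedded and added to both the attention logits and the value aggregation, so a partially built parse graph conditions the self-attention layer. The class name, dimensions, and relation-embedding scheme are illustrative assumptions, not the paper's exact configuration.

```python
import math
import torch
import torch.nn as nn


class GraphConditionedAttention(nn.Module):
    """Single-head self-attention conditioned on a labelled dependency graph (sketch)."""

    def __init__(self, d_model: int, n_relations: int):
        super().__init__()
        self.d_model = d_model
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        # One embedding per dependency relation (index 0 reserved for "no edge"),
        # used both when scoring pairs and when aggregating values.
        self.rel_k = nn.Embedding(n_relations, d_model)
        self.rel_v = nn.Embedding(n_relations, d_model)

    def forward(self, x: torch.Tensor, graph: torch.LongTensor) -> torch.Tensor:
        # x:     (batch, seq, d_model) token representations
        # graph: (batch, seq, seq) relation id for each (head, dependent) pair
        q, k, v = self.q(x), self.k(x), self.v(x)
        rk, rv = self.rel_k(graph), self.rel_v(graph)       # (batch, seq, seq, d_model)

        # Content-content scores plus content-relation scores from the input graph.
        scores = torch.einsum("bid,bjd->bij", q, k)
        scores = scores + torch.einsum("bid,bijd->bij", q, rk)
        attn = torch.softmax(scores / math.sqrt(self.d_model), dim=-1)

        # Aggregate values, again letting the graph relations contribute.
        out = torch.einsum("bij,bjd->bid", attn, v)
        out = out + torch.einsum("bij,bijd->bid", attn, rv)
        return out


if __name__ == "__main__":
    layer = GraphConditionedAttention(d_model=64, n_relations=40)
    tokens = torch.randn(2, 10, 64)                            # 2 sentences, 10 tokens each
    partial_parse = torch.zeros(2, 10, 10, dtype=torch.long)   # 0 = no edge yet
    partial_parse[:, 3, 4] = 7                                  # a hypothetical arc already built
    print(layer(tokens, partial_parse).shape)                   # torch.Size([2, 10, 64])
```

In a transition-based parser, the `graph` tensor would be refreshed from the partial parse after each transition, so later attention computations can condition on the arcs built so far.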


Related research

Recursive Non-Autoregressive Graph-to-Graph Transformer for Dependency Parsing with Iterative Refinement (03/29/2020)
We propose the Recursive Non-autoregressive Graph-to-graph Transformer a...

Transition-based Parsing with Stack-Transformers (10/20/2020)
Modeling the parser state is key to good performance in transition-based...

Transition-based Semantic Dependency Parsing with Pointer Networks (05/27/2020)
Transition-based parsers implemented with Pointer Networks have become t...

TreeGen: A Tree-Based Transformer Architecture for Code Generation (11/22/2019)
A code generation system generates programming language code based on an...

Hierarchical Pointer Net Parsing (08/30/2019)
Transition-based top-down parsing with pointer networks has achieved sta...

Unveiling Transformers with LEGO: a synthetic reasoning task (06/09/2022)
We propose a synthetic task, LEGO (Learning Equality and Group Operation...

Improving Semantic Matching through Dependency-Enhanced Pre-trained Model with Adaptive Fusion (10/16/2022)
Transformer-based pre-trained models like BERT have achieved great progr...
