Levi Graph AMR Parser using Heterogeneous Attention

07/09/2021
by   Han He, et al.
Coupled with biaffine decoders, transformers have been effectively adapted to text-to-graph transduction and have achieved state-of-the-art performance on AMR parsing. Many prior works, however, rely on the biaffine decoder for arc and/or label prediction, although most features used by the decoder may already be learned by the transformer. This paper presents a novel approach to AMR parsing that combines heterogeneous data (tokens, concepts, labels) into one input to a transformer to learn attention, and uses only the attention matrices from the transformer to predict all elements of AMR graphs (concepts, arcs, labels). Although our models use significantly fewer parameters than the previous state-of-the-art graph parser, they show similar or better accuracy on AMR 2.0 and 3.0.
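As a rough illustration of the core idea, the sketch below reads the softmaxed attention matrix of a self-attention layer as an arc-score matrix over a single heterogeneous sequence, in place of a separate biaffine decoder. This is a minimal sketch in plain PyTorch, not the paper's implementation: the class name, dimensions, and single-layer setup are placeholders of ours, and the paper's actual model, input packing, and label prediction differ in detail.

```python
import torch
import torch.nn as nn

class AttentionArcScorer(nn.Module):
    """Sketch: score candidate arcs directly from an attention matrix
    instead of a biaffine decoder. All names and hyperparameters here
    are illustrative assumptions, not the paper's configuration."""

    def __init__(self, d_model=256, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, h):
        # h: (batch, seq, d_model) hidden states over one concatenated
        # sequence of token, concept, and label embeddings.
        # need_weights=True returns the softmaxed attention matrix,
        # which we read directly as arc probabilities.
        _, attn_weights = self.attn(h, h, h, need_weights=True,
                                    average_attn_weights=True)
        return attn_weights  # (batch, seq, seq)

if __name__ == "__main__":
    scorer = AttentionArcScorer()
    # Stand-in for encoder states over the heterogeneous input
    # (batch=2, sequence length=10).
    h = torch.randn(2, 10, 256)
    arc_probs = scorer(h)
    print(arc_probs.shape)  # torch.Size([2, 10, 10])
```

Reusing the attention matrix this way is what removes the biaffine decoder's extra weight matrices, which is consistent with the parameter savings the abstract reports.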

