Transition-based Abstract Meaning Representation Parsing with Contextual Embeddings

06/13/2022
by Yichao Liang, et al.

The ability to understand and generate language sets human cognition apart from that of other known life forms. We study a way of combining two of the most successful routes to the meaning of language, statistical language models and symbolic semantics formalisms, in the task of semantic parsing. Building on a transition-based Abstract Meaning Representation (AMR) parser, AmrEager, we explore the utility of incorporating pretrained context-aware word embeddings, such as BERT and RoBERTa, into AMR parsing, contributing a new parser we dub AmrBerger. Experiments show that these rich lexical features alone are not particularly helpful in improving the parser's overall performance, as measured by the SMATCH score, compared to its non-contextual counterpart, while additional concept information enables the system to outperform the baselines. Through a lesion study, we find that the use of contextual embeddings helps make the system more robust against the removal of explicit syntactic features. These findings expose the strengths and weaknesses of contextual embeddings and language models in their current form, and motivate a deeper understanding thereof.
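As a rough illustration of the kind of lexical features the abstract describes, the sketch below (not the authors' code, and the model name "roberta-base" and helper word_embeddings are illustrative assumptions) shows one common way to obtain per-word contextual vectors from RoBERTa that could replace or augment the static word embeddings in a transition-based parser's feature representation.

```python
# Minimal sketch, assuming the HuggingFace transformers and PyTorch libraries:
# produce one contextual vector per whitespace token by averaging RoBERTa
# subword states, a typical way to feed contextual lexical features to a parser.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base", add_prefix_space=True)
model = AutoModel.from_pretrained("roberta-base")
model.eval()

def word_embeddings(words):
    """Return one contextual vector per input word (mean over its subwords)."""
    enc = tokenizer(words, is_split_into_words=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state.squeeze(0)  # (num_subwords, 768)
    vectors = []
    for i in range(len(words)):
        # word_ids() maps each subword position back to its source word index
        idx = [p for p, w in enumerate(enc.word_ids()) if w == i]
        vectors.append(hidden[idx].mean(dim=0))
    return torch.stack(vectors)  # (num_words, 768)

# Hypothetical usage: per-word vectors for a tokenized sentence.
print(word_embeddings(["The", "boy", "wants", "to", "go"]).shape)
```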


Related research

04/24/2017 · Robust Incremental Neural Semantic Graph Parsing
Parsing sentences to linguistically-expressive semantic representations ...

10/12/2020 · HUJI-KU at MRP 2020: Two Transition-based Neural Parsers
This paper describes the HUJI-KU system submission to the shared task on...

10/29/2021 · Structure-aware Fine-tuning of Sequence-to-sequence Transformers for Transition-based AMR Parsing
Predicting linearized Abstract Meaning Representation (AMR) graphs using...

07/02/2019 · Neural Semantic Parsing with Anonymization for Command Understanding in General-Purpose Service Robots
Service robots are envisioned to undertake a wide range of tasks at the ...

09/18/2019 · Improving Natural Language Inference with a Pretrained Parser
We introduce a novel approach to incorporate syntax into natural languag...

11/09/2020 · A Semantic Framework for PEGs
Parsing Expression Grammars (PEGs) are a recognition-based formalism whi...

09/18/2020 · Will it Unblend?
Natural language processing systems often struggle with out-of-vocabular...
