Learning compositional structures for semantic graph parsing

06/08/2021
by Jonas Groschwitz et al.

AM dependency parsing is a method for neural semantic graph parsing that exploits the principle of compositionality. While AM dependency parsers have been shown to be fast and accurate across several graphbanks, they require explicit annotations of the compositional tree structures for training. In the past, these were obtained using complex graphbank-specific heuristics written by experts. Here we show how they can instead be trained directly on the graphs with a neural latent-variable model, drastically reducing the amount and complexity of manual heuristics. We demonstrate that our model picks up on several linguistic phenomena on its own and achieves comparable accuracy to supervised training, greatly facilitating the use of AM dependency parsing for new sembanks.
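The principle of compositionality here means that each word contributes a small graph fragment with open argument slots ("sources"), and the compositional tree specifies which algebra operations combine those fragments into the sentence's graph. The toy sketch below illustrates this idea with a hypothetical apply operation; it is a simplified illustration, not the authors' implementation, and all names are made up.

```python
# Illustrative sketch (not the authors' code): AM dependency parsing builds a
# sentence's graph by combining per-word graph fragments with algebra
# operations such as "apply at source s". All names here are hypothetical.

class Fragment:
    """A graph fragment: edges plus named 'source' slots awaiting arguments."""
    def __init__(self, root, edges, sources):
        self.root = root              # root node label
        self.edges = list(edges)      # (head, relation, dependent) triples
        self.sources = dict(sources)  # source name -> placeholder node

    def apply(self, source, argument):
        """Plug the argument fragment's root into one of our source slots."""
        placeholder = self.sources.pop(source)
        new_edges = [(h, r, argument.root if d == placeholder else d)
                     for (h, r, d) in self.edges]
        # The combined fragment keeps any still-unfilled sources of both parts.
        return Fragment(self.root, new_edges + argument.edges,
                        {**self.sources, **argument.sources})

# "The cat sleeps": 'sleep' has a subject source 's' to be filled by 'cat'.
sleeps = Fragment("sleep", [("sleep", "ARG0", "<s>")], {"s": "<s>"})
cat = Fragment("cat", [], {})
graph = sleeps.apply("s", cat)
print(graph.edges)   # [('sleep', 'ARG0', 'cat')]
```

Supervised training requires the tree of such operations for every sentence; the latent-variable model described above instead treats that tree as unobserved and learns it from the graphs alone.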


Related research

AMR Dependency Parsing with a Typed Semantic Algebra (05/29/2018)
We present a semantic parser for Abstract Meaning Representations which ...

Learning Joint Semantic Parsers from Disjoint Data (04/17/2018)
We present a new approach to learning semantic parsers from multiple dat...

Is Japanese CCGBank empirically correct? A case study of passive and causative constructions (02/28/2023)
The Japanese CCGBank serves as training and evaluation data for developi...

Learning Compositional Neural Information Fusion for Human Parsing (01/19/2020)
This work proposes to combine neural networks with the compositional hie...

Normalizing Compositional Structures Across Graphbanks (04/29/2020)
The emergence of a variety of graph-based meaning representations (MRs) ...

Non-Projective Dependency Parsing via Latent Heads Representation (LHR) (02/06/2018)
In this paper, we introduce a novel approach based on a bidirectional re...

An Automatic Machine Translation Evaluation Metric Based on Dependency Parsing Model (08/09/2015)
Most of the syntax-based metrics obtain the similarity by comparing the ...
