
Learning compositional structures for semantic graph parsing

by Jonas Groschwitz et al.

AM dependency parsing is a method for neural semantic graph parsing that exploits the principle of compositionality. While AM dependency parsers have been shown to be fast and accurate across several graphbanks, they require explicit annotations of the compositional tree structures for training. In the past, these were obtained using complex graphbank-specific heuristics written by experts. Here we show how they can instead be trained directly on the graphs with a neural latent-variable model, drastically reducing the amount and complexity of manual heuristics. We demonstrate that our model picks up on several linguistic phenomena on its own and achieves comparable accuracy to supervised training, greatly facilitating the use of AM dependency parsing for new sembanks.
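The core idea described here — treating the compositional tree structure as a latent variable and marginalizing over it during training, rather than requiring heuristic annotations — can be sketched as a toy negative log-likelihood. This is a minimal illustration under assumed inputs (lists of model scores for candidate derivations), not the authors' actual neural model:

```python
import math

def logsumexp(xs):
    """Numerically stable log-sum-exp over a list of log-scores."""
    m = max(xs)
    return m + math.log(sum(math.exp(x - m) for x in xs))

def latent_nll(gold_scores, all_scores):
    """Negative log-likelihood of the gold graph when the derivation
    is latent: marginalize over the derivations that yield the gold
    graph (gold_scores), normalizing over all candidate derivations
    (all_scores). Both arguments are hypothetical model log-scores."""
    return logsumexp(all_scores) - logsumexp(gold_scores)

# With a single candidate that is also the gold derivation, the loss is 0;
# competing derivations that do not yield the gold graph increase it.
loss = latent_nll([1.0], [1.0, -0.5, 0.3])
```

In the actual model, the sum over latent trees is over exponentially many compositional structures, so it would be computed with dynamic programming rather than an explicit list; the sketch only shows the shape of the objective.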



