Something Old, Something New: Grammar-based CCG Parsing with Transformer Models

09/21/2021
by Stephen Clark, et al.

This report describes the parsing problem for Combinatory Categorial Grammar (CCG), showing how a combination of Transformer-based neural models and a symbolic CCG grammar can lead to substantial gains over existing approaches. The report also documents a 20-year research program, showing how NLP methods have evolved over this time. The staggering accuracy improvements provided by neural models for CCG parsing can be seen as a reflection of the improvements seen in NLP more generally. The report provides a minimal introduction to CCG and CCG parsing, with many pointers to the relevant literature. It then describes the CCG supertagging problem, and some recent work from Tian et al. (2020) which applies Transformer-based models to supertagging with great effect. I use this existing model to develop a CCG multitagger, which can serve as a front-end to an existing CCG parser. Simply using this new multitagger provides substantial gains in parsing accuracy. I then show how a Transformer-based model from the parsing literature can be combined with the grammar-based CCG parser, setting a new state-of-the-art for the CCGbank parsing task of almost 93% F-score for labelled dependencies, with complete sentence accuracies of over 50%.
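The multitagging idea mentioned in the abstract can be illustrated with a short sketch: a supertagger assigns each word a probability distribution over CCG lexical categories, and the multitagger keeps every category whose probability falls within a factor β of the most probable one, passing this reduced set to the parser. This is a minimal illustration of the standard β-threshold scheme, not the report's actual code; the category names and probabilities below are made up for the example.

```python
def multitag(tag_probs, beta=0.1):
    """Select a variable-width set of CCG categories per word.

    tag_probs: list with one entry per word, each a dict mapping
    a CCG category string to its probability under the supertagger.
    Returns, for each word, the categories whose probability is at
    least beta times the best category's probability, most probable
    first.
    """
    selected = []
    for probs in tag_probs:
        best = max(probs.values())
        # Keep every category within a factor beta of the best one.
        keep = [cat for cat, p in probs.items() if p >= beta * best]
        keep.sort(key=lambda cat: -probs[cat])
        selected.append(keep)
    return selected


# Illustrative distributions a Transformer supertagger might output
# for the sentence "He reads books" (values are invented).
probs = [
    {"NP": 0.95, "N": 0.05},
    {"(S\\NP)/NP": 0.80, "S\\NP": 0.15, "N": 0.05},
    {"NP": 0.60, "N": 0.40},
]
print(multitag(probs, beta=0.1))
# → [['NP'], ['(S\\NP)/NP', 'S\\NP'], ['NP', 'N']]
```

Lowering β widens the category sets, trading parser speed for coverage; an ambiguous word like "reads" keeps several candidate categories while an unambiguous one keeps only its best.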


