A Simple and Strong Baseline for End-to-End Neural RST-style Discourse Parsing

10/15/2022
by Naoki Kobayashi, et al.

To promote the further development of RST-style discourse parsing models, we need a strong baseline that can serve as a reference for reporting reliable experimental results. This paper explores such a baseline by combining two simple existing parsing strategies, top-down and bottom-up, with various transformer-based pre-trained language models. The experimental results obtained on two benchmark datasets demonstrate that parsing performance depends strongly on the pre-trained language model rather than on the parsing strategy. In particular, the bottom-up parser achieves large performance gains over the current best parser when employing DeBERTa. Our analysis of intra- and multi-sentential parsing and of nuclearity prediction further reveals that language models trained with a span-masking scheme are especially effective at boosting parsing performance.
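To make the bottom-up strategy concrete, the following is a minimal, self-contained sketch of greedy bottom-up binary tree building over elementary discourse units (EDUs). In the paper's setting, the merge scores would come from span representations produced by a pre-trained encoder such as DeBERTa; here `score_merge` is a hypothetical stand-in (favoring smaller merges) so the sketch runs on its own.

```python
def num_leaves(node):
    """Count the EDUs covered by a (sub)tree; leaves are strings."""
    if isinstance(node, tuple):
        return num_leaves(node[0]) + num_leaves(node[1])
    return 1

def score_merge(left, right):
    # Hypothetical stub scorer: prefer merging smaller adjacent spans first.
    # A real parser would score the pair with encoder-derived span features.
    return -(num_leaves(left) + num_leaves(right))

def bottom_up_parse(edus):
    """Greedily merge the best-scoring adjacent pair until one tree remains."""
    nodes = list(edus)
    while len(nodes) > 1:
        best = max(range(len(nodes) - 1),
                   key=lambda i: score_merge(nodes[i], nodes[i + 1]))
        merged = (nodes[best], nodes[best + 1])
        nodes[best:best + 2] = [merged]
    return nodes[0]

tree = bottom_up_parse(["e1", "e2", "e3", "e4"])
# With the stub scorer this yields the balanced tree (("e1", "e2"), ("e3", "e4")).
```

A top-down parser would instead recursively split the full EDU sequence at the best-scoring boundary; both strategies can share the same underlying encoder, which is what lets the paper isolate the contribution of the language model.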


