Neural Constituency Parsing of Speech Transcripts

by Paria Jamshid Lou, et al.

This paper studies the performance of a neural self-attentive parser on transcribed speech. Speech presents parsing challenges that do not appear in written text, such as the lack of punctuation and the presence of speech disfluencies (including filled pauses, repetitions, and corrections). Disfluencies are especially problematic for conventional syntactic parsers, which typically fail to find any EDITED disfluency nodes at all. This has motivated both dedicated disfluency detection systems and special mechanisms added to parsers to handle disfluencies. However, we show here that neural parsers can find EDITED disfluency nodes, and that the best neural parsers find them with an accuracy surpassing that of specialized disfluency detection systems, making these specialized mechanisms unnecessary. The paper also investigates a modified loss function that puts more weight on EDITED nodes, and describes tree transformations that simplify the disfluency detection task by providing alternative encodings of disfluencies and syntactic information.
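As a rough illustration of the weighted-loss idea, the sketch below shows a span-label cross-entropy in which gold EDITED nodes receive a larger weight. This is only a minimal assumption of how such a loss could look (the `EDITED_WEIGHT` factor and the dictionary-of-log-probabilities representation are hypothetical), not the authors' exact formulation.

```python
import math

# Hypothetical up-weighting factor for EDITED (disfluency) constituents.
EDITED_WEIGHT = 2.0

def weighted_label_loss(span_log_probs, gold_labels):
    """Negative log-likelihood over constituent spans, with EDITED
    spans up-weighted.

    span_log_probs: one dict per span, mapping label -> log-probability.
    gold_labels:    the gold constituent label for each span.
    """
    total = 0.0
    for log_probs, gold in zip(span_log_probs, gold_labels):
        weight = EDITED_WEIGHT if gold == "EDITED" else 1.0
        total += -weight * log_probs[gold]
    return total

# Toy example: two spans, the second one a gold EDITED node.
span_log_probs = [
    {"NP": math.log(0.7), "EDITED": math.log(0.3)},
    {"NP": math.log(0.4), "EDITED": math.log(0.6)},
]
gold_labels = ["NP", "EDITED"]
loss = weighted_label_loss(span_log_probs, gold_labels)
```

In a real parser this weighting would be applied inside the tree-scoring objective rather than over independent spans, but the effect is the same: errors on EDITED nodes cost more, pushing the model toward better disfluency recall.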




