
Disfluency Detection using a Bidirectional LSTM

by Vicky Zayats, et al.
University of Washington

We introduce a new approach for disfluency detection using a Bidirectional Long Short-Term Memory neural network (BLSTM). In addition to the word sequence, the model takes as input pattern-match features that were developed to reduce sensitivity to vocabulary size in training, leading to improved performance over the word sequence alone. The BLSTM takes advantage of explicit repair states in addition to the standard reparandum states. The final output leverages integer linear programming to incorporate constraints of disfluency structure. In experiments on the Switchboard corpus, the model achieves state-of-the-art performance for both the standard disfluency detection task and the correction detection task. Analysis shows that the model has better detection of non-repetition disfluencies, which tend to be much harder to detect.
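The architecture described above can be sketched as a bidirectional LSTM tagger whose per-token input concatenates a word embedding with extra pattern-match features. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: the feature dimension, hidden size, and label set are assumptions, and the integer-linear-programming post-processing step is omitted.

```python
# Sketch of a BLSTM disfluency tagger (illustrative; not the paper's code).
import torch
import torch.nn as nn

class BiLSTMDisfluencyTagger(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, feat_dim=8,
                 hidden_dim=128, num_labels=5):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # Word embedding concatenated with hand-crafted pattern-match
        # features (e.g., indicators that the same word or POS occurred
        # a few tokens back) -- dimensions here are placeholders.
        self.lstm = nn.LSTM(emb_dim + feat_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Labels would cover reparandum and explicit repair states.
        self.out = nn.Linear(2 * hidden_dim, num_labels)

    def forward(self, word_ids, pattern_feats):
        x = torch.cat([self.embed(word_ids), pattern_feats], dim=-1)
        h, _ = self.lstm(x)
        return self.out(h)  # per-token label scores

model = BiLSTMDisfluencyTagger()
words = torch.randint(0, 1000, (2, 10))  # batch of 2 sentences, 10 tokens
feats = torch.zeros(2, 10, 8)            # placeholder pattern features
scores = model(words, feats)             # shape: (2, 10, 5)
```

In the paper's full pipeline, the scores produced this way would then be constrained (e.g., repairs must follow reparanda) via integer linear programming at decoding time.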



