Exact Hard Monotonic Attention for Character-Level Transduction

05/15/2019
by Shijie Wu, et al.

Many common character-level, string-to-string transduction tasks, e.g., grapheme-to-phoneme conversion and morphological inflection, consist almost exclusively of monotonic transductions. However, neural sequence-to-sequence models with soft attention, which are non-monotonic, outperform popular monotonic models. In this work, we ask the following question: Is monotonicity really a helpful inductive bias in these tasks? We develop a hard-attention sequence-to-sequence model that enforces strict monotonicity and learns a latent alignment jointly with transduction. With the help of dynamic programming, we are able to compute the exact marginalization over all monotonic alignments. Our models achieve state-of-the-art performance on morphological inflection. Furthermore, we find strong performance on two other character-level transduction tasks.
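
The exact marginalization described in the abstract can be computed with a forward-algorithm-style dynamic program over source alignment positions. Below is a minimal sketch, not the authors' code: it assumes a first-order model factored into an initial alignment distribution, a monotonic transition matrix, and per-step emission scores. The names `log_init`, `log_trans`, and `log_emit` are illustrative, not from the paper.

```python
import numpy as np
from scipy.special import logsumexp

def log_marginal_monotonic(log_init, log_trans, log_emit):
    """Exact log p(y | x), marginalized over all monotonic hard alignments.

    log_init[i]          log p(a_1 = i)                     shape (S,)
    log_trans[i_prev, i] log p(a_j = i | a_{j-1} = i_prev)  shape (S, S);
                         entries with i < i_prev must be -inf, so the
                         alignment can only move left to right
    log_emit[j, i]       log p(y_j | a_j = i, x)            shape (T, S)
    """
    T, S = log_emit.shape
    # alpha[i] = log-prob of generating y_1..y_j with a_j = i
    alpha = log_init + log_emit[0]
    for j in range(1, T):
        # Sum over the previous alignment position i_prev (axis 0).
        alpha = logsumexp(alpha[:, None] + log_trans, axis=0) + log_emit[j]
    return logsumexp(alpha)  # sum out the final alignment position


# Toy usage with uniform monotonic transitions over a source of length S.
S, T = 5, 4
rng = np.random.default_rng(0)
log_emit = np.log(rng.dirichlet(np.ones(S), size=T))
log_init = np.full(S, -np.log(S))
log_trans = np.full((S, S), -np.inf)
for i_prev in range(S):
    log_trans[i_prev, i_prev:] = -np.log(S - i_prev)  # only i >= i_prev
print(log_marginal_monotonic(log_init, log_trans, log_emit))
```

In the paper's models these scores come from a neural encoder-decoder rather than fixed tables, but the recursion is the same: the monotonicity constraint lives entirely in the -inf entries of the transition matrix, and training can maximize this exact log-marginal likelihood in O(T·S²) time per sequence.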

Related research

08/29/2018 · Hard Non-Monotonic Attention for Character-Level Transduction
Character-level string-to-string transduction is an important component ...

04/08/2021 · On Biasing Transformer Attention Towards Monotonicity
Many sequence-to-sequence tasks in natural language processing are rough...

05/20/2020 · Applying the Transformer to Character-level Transduction
The transformer has been shown to outperform recurrent neural network-ba...

11/04/2016 · Morphological Inflection Generation with Hard Monotonic Attention
We present a neural model for morphological inflection generation which ...

02/16/2021 · Searching for Search Errors in Neural Morphological Inflection
Neural sequence-to-sequence models are currently the predominant choice ...

06/03/2019 · Robust Sequence-to-Sequence Acoustic Modeling with Stepwise Monotonic Attention for Neural TTS
Neural TTS has demonstrated strong capabilities to generate human-like s...

08/31/2018 · Imitation Learning for Neural Morphological String Transduction
We employ imitation learning to train a neural transition-based string t...
