
Text Normalization using Memory Augmented Neural Networks

by Subhojeet Pramanik, et al.
VIT University

We propose a memory augmented neural network to perform text normalization, i.e., the transformation of words from the written to the spoken form. With the addition of dynamic memory access and storage mechanisms, we present an architecture that can serve as a language-agnostic text normalization system while avoiding the kind of silly errors made by LSTM-based recurrent neural architectures. By reducing the number of unacceptable mistakes, we show that such a novel architecture is indeed a better alternative. Our proposed system also requires significantly less data, training time, and compute resources. However, errors still occur in certain semiotic classes. Nevertheless, we demonstrate that memory augmented networks with meta-learning capabilities can open many doors to a superior text normalization system.
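To make the task concrete, the sketch below illustrates text normalization itself, not the paper's memory-augmented model: written-form tokens are mapped to their spoken form, with number handling standing in for one semiotic class. The rules and class names here are simplified assumptions for illustration only.

```python
# Illustrative sketch of the text normalization task (written -> spoken form).
# A rule-based toy covering two simplified semiotic classes:
#   CARDINAL (digits 0-99) and PLAIN (tokens passed through unchanged).

ONES = ["zero", "one", "two", "three", "four",
        "five", "six", "seven", "eight", "nine"]
TEENS = ["ten", "eleven", "twelve", "thirteen", "fourteen",
         "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty",
        "fifty", "sixty", "seventy", "eighty", "ninety"]

def normalize_token(token):
    """Map one written-form token to its spoken form."""
    if token.isdigit():                 # CARDINAL class (0-99 for brevity)
        n = int(token)
        if n < 10:
            return ONES[n]
        if n < 20:
            return TEENS[n - 10]
        if n < 100:
            word = TENS[n // 10]
            return word if n % 10 == 0 else word + " " + ONES[n % 10]
    return token                        # PLAIN class: left as-is

def normalize(sentence):
    """Normalize a whitespace-tokenized sentence token by token."""
    return " ".join(normalize_token(t) for t in sentence.split())

print(normalize("flight 42 departs at gate 7"))
```

Such context-free rules are exactly what a learned system must go beyond: "2019" should read "twenty nineteen" as a year but "two thousand nineteen" as a cardinal, and a single misread token of this kind is the sort of unacceptable error the paper targets.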


