Lookup-Table Recurrent Language Models for Long Tail Speech Recognition

04/09/2021
by W. Ronny Huang, et al.

We introduce Lookup-Table Language Models (LookupLM), a method for scaling up the size of RNN language models with only a constant increase in floating point operations, by increasing the expressivity of the embedding table. In particular, we instantiate an additional embedding table which embeds the previous n-gram token sequence, rather than a single token. This allows the embedding table to be scaled up arbitrarily, with a commensurate increase in performance, without changing the token vocabulary. Since embeddings are sparsely retrieved from the table via a lookup, increasing the size of the table adds neither extra operations to each forward pass nor extra parameters that need to be stored in limited GPU/TPU memory. We explore scaling n-gram embedding tables up to nearly a billion parameters. When trained on a 3-billion sentence corpus, we find that LookupLM improves long tail log perplexity by 2.44 and long tail WER by 23.4% over a standard RNN language model baseline, an improvement comparable to scaling the baseline up by 6.2x in floating point operations.
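
To make the mechanism described above concrete, the sketch below augments a standard RNN language model with a large embedding table keyed by the previous n-gram, retrieved by a single sparse lookup per step. This is a minimal illustration, not the authors' implementation: the class name, table size, hashing scheme, and RNN configuration are assumptions chosen for clarity.

    # Minimal sketch (assumed details, not the paper's code): an RNN LM with an
    # additional embedding table indexed by a hash of the previous n-gram.
    import torch
    import torch.nn as nn

    class LookupLMSketch(nn.Module):
        def __init__(self, vocab_size=32000, embed_dim=256, hidden_dim=512,
                     ngram_order=3, ngram_table_size=2**22):
            super().__init__()
            # Standard single-token embedding table.
            self.token_emb = nn.Embedding(vocab_size, embed_dim)
            # Additional table keyed by the previous n-gram. Its size is
            # independent of the vocabulary, so it can be grown arbitrarily;
            # each step still costs only one sparse lookup into it.
            self.ngram_emb = nn.Embedding(ngram_table_size, embed_dim)
            self.ngram_order = ngram_order
            self.ngram_table_size = ngram_table_size
            self.rnn = nn.LSTM(2 * embed_dim, hidden_dim, batch_first=True)
            self.out = nn.Linear(hidden_dim, vocab_size)

        def _ngram_ids(self, tokens):
            # Hash each position's preceding n-gram into the table. The
            # hashing scheme here is an assumption for illustration.
            batch, seq_len = tokens.shape
            ids = torch.zeros(batch, seq_len, dtype=torch.long,
                              device=tokens.device)
            for t in range(seq_len):
                start = max(0, t - self.ngram_order + 1)
                context = tokens[:, start:t + 1]
                h = torch.zeros(batch, dtype=torch.long, device=tokens.device)
                for i in range(context.shape[1]):
                    h = h * 1000003 + context[:, i]
                ids[:, t] = h % self.ngram_table_size
            return ids

        def forward(self, tokens):
            tok = self.token_emb(tokens)                     # (B, T, E)
            ngram = self.ngram_emb(self._ngram_ids(tokens))  # (B, T, E)
            x = torch.cat([tok, ngram], dim=-1)              # (B, T, 2E)
            h, _ = self.rnn(x)
            return self.out(h)                               # next-token logits

    # Usage: score a batch of token-id sequences.
    model = LookupLMSketch()
    logits = model(torch.randint(0, 32000, (2, 16)))
    print(logits.shape)  # torch.Size([2, 16, 32000])

Note that enlarging ngram_table_size grows only the sparsely accessed table; the per-step compute of the RNN and the dense projections is unchanged, which is the scaling property the abstract highlights.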
