
Comparison of Lattice-Free and Lattice-Based Sequence Discriminative Training Criteria for LVCSR

07/01/2019
by Wilfried Michel, et al.

Sequence discriminative training criteria have long been a standard tool in automatic speech recognition for improving the performance of acoustic models over their maximum-likelihood / cross-entropy trained counterparts. While a lattice approximation of the search space was previously necessary to keep the computational complexity manageable, recently proposed methods use other approximations to dispense with the computationally expensive step of separate lattice creation. In this work we present a memory-efficient implementation of the forward-backward computation that allows us to use unigram word-level language models in the denominator calculation while still performing a full summation on GPU. This allows for a direct comparison of lattice-based and lattice-free sequence discriminative training criteria such as MMI and sMBR, both using the same language model during training. We compare performance, speed of convergence, and stability on large vocabulary continuous speech recognition tasks such as Switchboard and Quaero. We find that silence modeling seriously impacts performance in the lattice-free case and needs special treatment. In our experiments, lattice-free MMI performs on par with its lattice-based counterpart, while lattice-based sMBR still outperforms all lattice-free training criteria.
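The core computation behind the lattice-free criteria described above is a full summation over the denominator graph via the forward-backward algorithm. As a rough illustration of that computation (not the paper's GPU implementation), here is a minimal log-space NumPy sketch over a dense transition matrix; the function and variable names are illustrative assumptions, and a real denominator graph would be sparse.

```python
import numpy as np

def logsumexp(a, axis):
    """Numerically stable log-sum-exp along an axis."""
    m = np.max(a, axis=axis, keepdims=True)
    return np.squeeze(m + np.log(np.sum(np.exp(a - m), axis=axis, keepdims=True)), axis=axis)

def forward_backward(log_emit, log_trans, log_init, log_final):
    """Full-sum forward-backward over a dense graph in log space.

    log_emit:  (T, S) per-frame acoustic log-likelihoods for each graph state
    log_trans: (S, S) log transition weights, log_trans[i, j] for i -> j
    log_init:  (S,)   log initial-state weights
    log_final: (S,)   log final-state weights
    Returns the total log-likelihood log_Z and the (T, S) matrix of
    frame-state posteriors (occupancies) gamma.
    """
    T, S = log_emit.shape
    alpha = np.empty((T, S))
    beta = np.empty((T, S))
    alpha[0] = log_init + log_emit[0]
    for t in range(1, T):                      # forward pass
        alpha[t] = logsumexp(alpha[t - 1][:, None] + log_trans, axis=0) + log_emit[t]
    beta[T - 1] = log_final
    for t in range(T - 2, -1, -1):             # backward pass
        beta[t] = logsumexp(log_trans + (log_emit[t + 1] + beta[t + 1])[None, :], axis=1)
    log_Z = logsumexp(alpha[T - 1] + log_final, axis=0)
    gamma = np.exp(alpha + beta - log_Z)       # posteriors sum to 1 per frame
    return log_Z, gamma
```

For MMI, the gradient with respect to the frame log-likelihoods is the difference between numerator and denominator occupancies, so two such passes per utterance yield everything needed for training. The memory efficiency claimed in the paper concerns the GPU implementation of exactly this computation, which the dense sketch above does not attempt to reproduce.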

