Model Interpolation with Trans-dimensional Random Field Language Models for Speech Recognition

03/30/2016
by   Bin Wang, et al.

The dominant language models (LMs), such as n-gram and neural network (NN) models, represent sentence probabilities as products of conditionals. In contrast, the trans-dimensional random field (TRF) LM, which models the whole sentence as a random field, has recently been introduced and shown to deliver superior performance. In this paper, we examine how TRF models can be interpolated with NN models, and through log-linear combination obtain 12.1% and 17.9% relative error rate reductions over 6-gram LMs for English and Chinese speech recognition, respectively.
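The log-linear combination described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the interpolation weight `lam`, and the n-best tuple layout are all assumptions; in practice the weight is tuned on a development set and the LM scores come from trained TRF and NN models.

```python
def log_linear_interpolate(logp_trf, logp_nn, lam=0.5):
    """Log-linearly combine two LM log-probabilities for one sentence.

    lam is a hypothetical interpolation weight; it would normally be
    tuned on held-out data. The combined score is an unnormalized
    log-probability, which is sufficient for n-best rescoring.
    """
    return lam * logp_trf + (1.0 - lam) * logp_nn

def rescore_nbest(hyps, lam=0.5):
    """Rescore an n-best list from a speech recognizer.

    Each hypothesis is (sentence, acoustic_logp, trf_logp, nn_logp).
    Returns the sentence with the highest combined score.
    """
    scored = [
        (sent, ac + log_linear_interpolate(trf, nn, lam))
        for sent, ac, trf, nn in hyps
    ]
    return max(scored, key=lambda x: x[1])[0]
```

Because rescoring only compares hypotheses for the same utterance, the normalization constant of the combined model cancels, so the log-linear score can be used directly without renormalization.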


Related research:

- NN-grams: Unifying neural network and n-gram language models for Speech Recognition (06/23/2016)
  We present NN-grams, a novel, hybrid language model integrating n-grams ...

- Integrating Discrete and Neural Features via Mixed-feature Trans-dimensional Random Field Language Models (02/14/2020)
  There has been a long recognition that discrete features (n-gram feature...

- Learning neural trans-dimensional random field language models with noise-contrastive estimation (10/30/2017)
  Trans-dimensional random field language models (TRF LMs) where sentences...

- Improved training of neural trans-dimensional random field language models with dynamic noise-contrastive estimation (07/03/2018)
  A new whole-sentence language model - neural trans-dimensional random fi...

- Language modeling with Neural trans-dimensional random fields (07/23/2017)
  Trans-dimensional random field language models (TRF LMs) have recently b...

- Long-span language modeling for speech recognition (11/11/2019)
  We explore neural language modeling for speech recognition where the con...

- Computational Pronunciation Analysis in Sung Utterances (06/21/2021)
  Recent automatic lyrics transcription (ALT) approaches focus on building...
