Multi-Frame Cross-Entropy Training for Convolutional Neural Networks in Speech Recognition

07/29/2019
by Tom Sercu, et al.

We introduce Multi-Frame Cross-Entropy training (MFCE) for convolutional neural network acoustic models. Recognizing that, like RNNs, CNNs are by nature sequence models that take variable-length inputs, we propose feeding the CNN a portion of an utterance long enough that multiple labels are predicted at once, thereby obtaining a cross-entropy loss signal from multiple adjacent frames. This drastically increases the amount of label information at small marginal computational cost. We show large WER improvements on Hub5 and RT02 after training on the 2000-hour Switchboard benchmark.
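
The core mechanism lends itself to a short illustration. Below is a minimal PyTorch sketch, not the paper's actual architecture: the feature dimension, layer sizes, label inventory, and names such as MFCEConvModel are assumptions for the example. An unpadded 1-D CNN consumes a window of input frames wide enough to emit predictions for several adjacent frames, and cross-entropy is computed over all of those output frames at once.

```python
# Minimal sketch of multi-frame cross-entropy (MFCE); all sizes are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

FEAT_DIM = 40      # e.g. log-mel filterbank features (assumption)
NUM_LABELS = 9000  # e.g. context-dependent HMM states (assumption)

class MFCEConvModel(nn.Module):
    """Two unpadded conv layers: each output frame sees a fixed receptive
    field of 2*2 + 1 = 5 input frames of acoustic context."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv1d(FEAT_DIM, 256, kernel_size=3)  # shrinks T by 2
        self.conv2 = nn.Conv1d(256, 256, kernel_size=3)       # shrinks T by 2
        self.out = nn.Conv1d(256, NUM_LABELS, kernel_size=1)  # per-frame logits

    def forward(self, x):            # x: (batch, FEAT_DIM, T_in)
        h = F.relu(self.conv1(x))
        h = F.relu(self.conv2(h))
        return self.out(h)           # (batch, NUM_LABELS, T_in - 4)

model = MFCEConvModel()
batch, t_in = 8, 20                        # 20 input frames -> 16 output frames
feats = torch.randn(batch, FEAT_DIM, t_in)
labels = torch.randint(NUM_LABELS, (batch, t_in - 4))  # one label per output frame

logits = model(feats)                      # (batch, NUM_LABELS, 16)
# Cross-entropy over all 16 adjacent output frames at once, instead of
# cropping the window down to a single labeled center frame.
loss = F.cross_entropy(logits, labels)     # CE accepts (N, C, T) logits
loss.backward()
```

Because the convolution activations for neighboring output frames are shared, evaluating the loss on 16 frames costs only marginally more compute than on one, which is where the abstract's "small marginal computational cost" comes from.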

