Improvements to deep convolutional neural networks for LVCSR

09/05/2013
by Tara N. Sainath, et al.

Deep Convolutional Neural Networks (CNNs) are more powerful than Deep Neural Networks (DNNs), as they are able to better reduce spectral variation in the input signal. This has also been confirmed experimentally, with CNNs showing improvements in word error rate (WER) of between 4-12% relative compared to DNNs across a variety of LVCSR tasks. In this paper, we describe different methods to further improve CNN performance. First, we conduct a deep analysis comparing limited weight sharing and full weight sharing with state-of-the-art features. Second, we apply various pooling strategies that have shown improvements in computer vision to an LVCSR speech task. Third, we introduce a method to effectively incorporate speaker adaptation, namely fMLLR, into log-mel features. Fourth, we introduce an effective strategy to use dropout during Hessian-free sequence training. We find that with these improvements, particularly with fMLLR and dropout, we are able to achieve an additional 2-3% relative improvement in WER on a 50-hour Broadcast News task over our previous best CNN baseline. On a larger 400-hour BN task, we find an additional 4-5% relative improvement over our previous best CNN baseline.
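The abstract touches on several concrete ingredients: convolution with full or limited weight sharing, pooling, dropout, and fMLLR-adapted log-mel input features. The sketch below (PyTorch) is a minimal, illustrative CNN acoustic-model front end along those lines; it is not the paper's implementation, and the layer sizes, 40-dimensional log-mel input, 11-frame context window, and output dimension are assumptions chosen only to make the example runnable.

```python
# Minimal sketch (PyTorch) of a CNN acoustic-model front end of the kind the
# abstract discusses: convolution over log-mel features, pooling along the
# frequency axis, and dropout in the fully connected layers. All sizes are
# illustrative assumptions, not the paper's exact configuration.
import torch
import torch.nn as nn

class CNNAcousticModel(nn.Module):
    def __init__(self, n_mels=40, context=11, n_targets=512):
        super().__init__()
        self.conv = nn.Sequential(
            # Standard Conv2d shares weights over the whole frequency range
            # ("full weight sharing").
            nn.Conv2d(1, 128, kernel_size=(9, 9), padding=(4, 4)),
            nn.ReLU(),
            # Pool along frequency only, a common choice for speech inputs.
            nn.MaxPool2d(kernel_size=(3, 1)),
            nn.Conv2d(128, 256, kernel_size=(4, 3), padding=(0, 1)),
            nn.ReLU(),
        )
        # Infer the flattened size from a dummy input so the sketch stays self-contained.
        with torch.no_grad():
            flat = self.conv(torch.zeros(1, 1, n_mels, context)).numel()
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(p=0.5),                  # dropout, as explored in the paper
            nn.Linear(flat, 1024),
            nn.ReLU(),
            nn.Dropout(p=0.5),
            nn.Linear(1024, n_targets),         # scores over context-dependent HMM states
        )

    def forward(self, x):
        # x: (batch, 1, n_mels, context) log-mel patches, optionally fMLLR-adapted
        return self.classifier(self.conv(x))

model = CNNAcousticModel()
logits = model(torch.randn(8, 1, 40, 11))
print(logits.shape)  # torch.Size([8, 512])
```

Pooling here is applied along frequency only, which is the usual configuration for CNN acoustic models; the paper itself compares alternative pooling strategies, weight-sharing schemes, and dropout during Hessian-free sequence training in more detail.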
