Bifocal Neural ASR: Exploiting Keyword Spotting for Inference Optimization

08/03/2021
by Jonathan Macoskey, et al.

We present Bifocal RNN-T, a new variant of the Recurrent Neural Network Transducer (RNN-T) architecture designed to improve inference-time latency on speech recognition tasks. The architecture enables a dynamic pivot in its runtime compute pathway: it takes advantage of keyword spotting to select which component of the network to execute for a given audio frame. To accomplish this, we leverage a recurrent cell we call the Bifocal LSTM (BFLSTM), which we detail in the paper. The architecture is compatible with other optimization strategies such as quantization, sparsification, and time-reduction layers, making it especially applicable to deployed, real-time speech recognition settings. We present the architecture and report comparative experimental results on voice-assistant speech recognition tasks. Specifically, we show our proposed Bifocal RNN-T can improve inference cost by 29.1%.
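The core idea, a recurrent cell that routes each audio frame through either a lightweight or a full-capacity weight set depending on a keyword-spotting signal, can be sketched as follows. This is a minimal, hypothetical illustration in NumPy, not the paper's implementation: it assumes both branches share the same hidden dimensionality and that a per-frame boolean flag (e.g. "wake word detected") selects the branch; the class and function names are invented for this sketch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BifocalLSTMCell:
    """Sketch of a bifocal LSTM cell: two weight sets ("light" and "full")
    operating on a shared hidden state; a per-frame flag picks which set
    runs. Hypothetical -- the paper's BFLSTM details may differ."""

    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        d = input_dim + hidden_dim
        # One (W, b) pair per focus; gates stacked as [i, f, g, o].
        self.weights = {
            name: (rng.standard_normal((4 * hidden_dim, d)) * 0.1,
                   np.zeros(4 * hidden_dim))
            for name in ("light", "full")
        }
        self.hidden_dim = hidden_dim

    def step(self, x, h, c, focus):
        # Standard LSTM update, using the weight set chosen for this frame.
        W, b = self.weights[focus]
        z = W @ np.concatenate([x, h]) + b
        H = self.hidden_dim
        i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
        g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
        c_new = f * c + i * g
        h_new = o * np.tanh(c_new)
        return h_new, c_new

def run(cell, frames, keyword_flags):
    """Route each frame through the light or full branch based on a
    keyword-spotting flag (True once the wake word has been detected)."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    outs = []
    for x, kw in zip(frames, keyword_flags):
        h, c = cell.step(x, h, c, "full" if kw else "light")
        outs.append(h)
    return np.stack(outs)
```

In a deployed setting the payoff comes from the light branch being cheaper to execute (smaller, sparser, or more aggressively quantized), so that frames before the wake word, which dominate an always-on audio stream, cost far less compute than frames carrying the user's actual request.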

Related research

- Dual-Attention Neural Transducers for Efficient Wake Word Spotting in Speech Recognition (04/03/2023): "We present dual-attention neural biasing, an architecture designed to bo..."
- Amortized Neural Networks for Low-Latency Speech Recognition (08/03/2021): "We introduce Amortized Neural Networks (AmNets), a compute cost- and lat..."
- Accelerator-Aware Training for Transducer-Based Speech Recognition (05/12/2023): "Machine learning model weights and activations are represented in full-p..."
- Convolutional Recurrent Neural Networks for Small-Footprint Keyword Spotting (03/15/2017): "Keyword spotting (KWS) constitutes a major component of human-technology..."
- Sub-8-bit quantization for on-device speech recognition: a regularization-free approach (10/17/2022): "For on-device automatic speech recognition (ASR), quantization aware tra..."
- An Ultra-low Power RNN Classifier for Always-On Voice Wake-Up Detection Robust to Real-World Scenarios (03/08/2021): "We present in this paper an ultra-low power (ULP) Recurrent Neural Netwo..."
- Temporal Feedback Convolutional Recurrent Neural Networks for Keyword Spotting (10/30/2019): "While end-to-end learning has become a trend in deep learning, the model..."
