Low-resource Low-footprint Wake-word Detection using Knowledge Distillation

07/06/2022
by Arindam Ghosh et al.

As virtual assistants have become more diverse and specialized, so has the demand for application- or brand-specific wake words. However, the wake-word-specific datasets typically used to train wake-word detectors are costly to create. In this paper, we explore two techniques to leverage acoustic modeling data for large-vocabulary speech recognition to improve a purpose-built wake-word detector: transfer learning and knowledge distillation. We also explore how these techniques interact with time-synchronous training targets to improve detection latency. Experiments are presented on the open-source "Hey Snips" dataset and a more challenging in-house far-field dataset. Using phone-synchronous targets and knowledge distillation from a large acoustic model, we are able to improve accuracy across dataset sizes for both datasets while reducing latency.
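To make the knowledge-distillation idea concrete, below is a minimal sketch of the standard Hinton-style distillation loss that such a setup could use: the small wake-word student is trained against temperature-softened posteriors from the large acoustic-model teacher, blended with the ordinary hard-label cross-entropy. All function names, the temperature `T`, and the mixing weight `alpha` are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled, numerically stable softmax.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, hard_labels, T=2.0, alpha=0.5):
    """Illustrative KD loss: blend of soft-target cross-entropy (vs. the
    teacher's temperature-softened posteriors) and hard-label cross-entropy.
    T and alpha are hypothetical hyperparameters, not values from the paper."""
    p_teacher = softmax(teacher_logits, T)          # soft targets from the big model
    log_p_student = np.log(softmax(student_logits, T))
    # Cross-entropy against soft targets; the T^2 factor keeps gradient
    # magnitudes comparable across temperatures (as in Hinton et al.).
    kd_term = -(p_teacher * log_p_student).sum(axis=-1).mean() * (T ** 2)
    # Ordinary cross-entropy against the ground-truth frame labels.
    log_p = np.log(softmax(student_logits))
    ce_term = -log_p[np.arange(len(hard_labels)), hard_labels].mean()
    return alpha * kd_term + (1 - alpha) * ce_term

# Toy usage: a batch of 2 frames over 3 classes (e.g. wake-word phone states).
student = np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]])
teacher = np.array([[2.2, 0.4, 0.0], [0.1, 1.8, 0.2]])
labels = np.array([0, 1])
loss = distillation_loss(student, teacher, labels)
```

In a real system the logits would come from per-frame outputs of the student and teacher networks, and the hard labels from the (phone-synchronous) alignment targets the abstract mentions.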


