ABC-KD: Attention-Based-Compression Knowledge Distillation for Deep Learning-Based Noise Suppression

05/26/2023
by Yixin Wan, et al.

Noise suppression (NS) models have been widely applied to enhance speech quality. Recently, deep learning-based NS, which we denote as Deep Noise Suppression (DNS), has become the mainstream NS method due to its superior performance over traditional approaches. However, DNS models face two major challenges in supporting real-world applications. First, high-performing DNS models are usually large, which makes deployment difficult. Second, DNS models require extensive training data, consisting of noisy audio as inputs and clean audio as labels, and clean labels are often difficult to obtain. We propose the use of knowledge distillation (KD) to resolve both challenges. Our study serves two main purposes. First, we are among the first to comprehensively investigate mainstream KD techniques on DNS models with respect to these two challenges. Second, we propose a novel Attention-Based-Compression KD method that outperforms all investigated mainstream KD frameworks on the DNS task.
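The abstract's core idea — training a small student DNS model against a large teacher, optionally without clean labels — can be illustrated with a generic teacher-student loss. This is a minimal sketch of standard KD for a regression-style NS target, not the paper's ABC-KD method; the function name, the L1 objective, and the `alpha` weighting are illustrative assumptions.

```python
import numpy as np

def kd_loss(student_out, teacher_out, clean_target=None, alpha=0.5):
    """Generic teacher-student loss for a DNS student (illustrative sketch).

    student_out / teacher_out: enhanced spectrograms (or masks) predicted
    by the student and teacher models on the same noisy input.
    clean_target: optional clean-speech label. When it is unavailable,
    training falls back to pure distillation, which is how KD can sidestep
    the clean-label requirement mentioned in the abstract.
    """
    # Distillation term: the student mimics the teacher's enhanced output.
    distill = np.abs(student_out - teacher_out).mean()
    if clean_target is None:
        return distill  # label-free training: teacher output acts as the label
    # Supervised term: standard NS regression loss against the clean label.
    supervised = np.abs(student_out - clean_target).mean()
    return alpha * distill + (1 - alpha) * supervised
```

With `alpha = 1` this reduces to pure distillation; with `alpha = 0` it is ordinary supervised NS training. The paper's contribution lies in *which* intermediate representations are distilled (attention-based compression), which this sketch does not attempt to reproduce.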


