Dynamic nsNet2: Efficient Deep Noise Suppression with Early Exiting

08/31/2023
by Riccardo Miccini, et al.

Although deep learning has made strides in the field of deep noise suppression, deploying deep architectures on resource-constrained devices remains challenging. Therefore, we present an early-exiting model based on nsNet2 that provides several levels of accuracy and resource savings by halting computations at different stages. Moreover, we adapt the original architecture by splitting the information flow to account for the injected dynamism. We show the trade-offs between performance and computational complexity based on established metrics.
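As a rough illustration of the early-exiting idea, here is a minimal PyTorch sketch, not the authors' implementation. It stacks recurrent stages (nsNet2 is a GRU-based masking network) and attaches a lightweight exit head to each stage, so inference can halt after any stage and trade suppression quality for compute. The class, head design, and parameter names (EarlyExitDenoiser, exit_stage, n_stages) are illustrative assumptions, and the paper's split-information-flow adaptation is not modeled here.

import torch
import torch.nn as nn

class EarlyExitDenoiser(nn.Module):
    """Illustrative early-exiting noise suppressor (hypothetical, not nsNet2)."""

    def __init__(self, n_bins=257, hidden=400, n_stages=3):
        super().__init__()
        self.stages = nn.ModuleList()
        self.exits = nn.ModuleList()
        in_dim = n_bins
        for _ in range(n_stages):
            # One recurrent stage of the backbone.
            self.stages.append(nn.GRU(in_dim, hidden, batch_first=True))
            # Hypothetical exit head: hidden state -> per-bin mask in [0, 1].
            self.exits.append(nn.Sequential(nn.Linear(hidden, n_bins), nn.Sigmoid()))
            in_dim = hidden

    def forward(self, spec_mag, exit_stage=None):
        # spec_mag: (batch, frames, n_bins) noisy magnitude spectrogram.
        # exit_stage: 1-based stage at which to stop; None runs the full depth.
        h = spec_mag
        masks = []
        for i, (gru, head) in enumerate(zip(self.stages, self.exits), start=1):
            h, _ = gru(h)
            masks.append(head(h))
            if exit_stage is not None and i >= exit_stage:
                break
        return masks  # training would typically supervise all exits jointly

In this sketch, model(noisy, exit_stage=1)[-1] yields a cheap, lower-quality mask, while model(noisy)[-1] uses the full depth; the enhanced magnitude is the noisy spectrogram multiplied element-wise by the chosen mask. Each additional stage adds its GRU and head cost, which is the kind of accuracy/complexity trade-off the paper quantifies.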

