Compression of Acoustic Event Detection Models with Low-rank Matrix Factorization and Quantization Training

05/02/2019
by Bowen Shi, et al.

In this paper, we present a compression approach based on the combination of low-rank matrix factorization and quantization training, to reduce complexity for neural network based acoustic event detection (AED) models. Our experimental results show this combined compression approach is very effective. For a three-layer long short-term memory (LSTM) based AED model, the original model size can be reduced to 1%, which makes it feasible to deploy AED for resource-constrained applications.
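As a rough illustration of the two techniques named in the abstract, the sketch below factorizes a single weight matrix with truncated SVD and simulates uniform symmetric 8-bit quantization in NumPy. The matrix shape, rank, and bit width are arbitrary placeholders, and the snippet does not reproduce the paper's actual model architecture or quantization-aware training procedure.

```python
import numpy as np

def low_rank_factorize(W, rank):
    """Approximate W (m x n) as U_r @ V_r with U_r (m x rank), V_r (rank x n) via truncated SVD."""
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    U_r = U[:, :rank] * np.sqrt(s[:rank])           # absorb sqrt of singular values into each factor
    V_r = np.sqrt(s[:rank])[:, None] * Vt[:rank, :]
    return U_r, V_r

def fake_quantize(W, num_bits=8):
    """Simulate uniform symmetric quantization (quantize then dequantize),
    as is typically done during quantization-aware training."""
    qmax = 2 ** (num_bits - 1) - 1
    scale = np.max(np.abs(W)) / qmax
    q = np.clip(np.round(W / scale), -qmax, qmax)
    return q * scale

# Toy example: compress one weight matrix (shape and rank are hypothetical).
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 256))            # stand-in for an LSTM gate weight matrix
U_r, V_r = low_rank_factorize(W, rank=32)      # rank is the main compression knob
W_compressed = fake_quantize(U_r) @ fake_quantize(V_r)
print("relative error:", np.linalg.norm(W - W_compressed) / np.linalg.norm(W))
```

In practice the rank would be chosen per layer to trade accuracy against model size, and the fake-quantization step would be applied inside the training loop so that the network can adapt to the quantization error rather than being quantized only after training.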


