Pushing the limits of RNN Compression

10/04/2019
by Urmish Thakker et al.

Recurrent Neural Networks (RNNs) can be difficult to deploy on resource-constrained devices due to their size. As a result, there is a need for compression techniques that can significantly compress RNNs without negatively impacting task accuracy. This paper introduces a method to compress RNNs for resource-constrained environments using Kronecker products (KPs). KPs can compress RNN layers by 16-38x with minimal accuracy loss. We show that KPs can beat the task accuracy achieved by other state-of-the-art compression techniques (pruning and low-rank matrix factorization) across 4 benchmarks spanning 3 different applications, while simultaneously improving inference run-time.
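The core idea behind KP compression can be sketched in a few lines of NumPy: a large weight matrix W is represented as the Kronecker product of two much smaller factors, and the matrix-vector product needed at inference time can be computed without ever materializing W. The shapes below (a 64x64 matrix from two 8x8 factors, giving 32x compression) are illustrative assumptions, not the exact configurations used in the paper.

```python
import numpy as np

# Hypothetical shapes: a 64x64 RNN weight matrix factored as the
# Kronecker product of two 8x8 matrices. Real layers in the paper
# use shapes chosen per benchmark; these are for illustration only.
A = np.random.randn(8, 8)
B = np.random.randn(8, 8)

W = np.kron(A, B)                # full 64x64 matrix, built here only to verify

dense_params = W.size            # 4096 parameters in the dense matrix
kp_params = A.size + B.size      # 128 parameters in the KP factors
print(dense_params / kp_params)  # 32.0x compression

# Inference never needs the full W: with column-major vec(),
# (A kron B) @ vec(X) == vec(B @ X @ A.T).
x = np.random.randn(64)
X = x.reshape(8, 8, order="F")   # column-major "unvec" of x
y_fast = (B @ X @ A.T).flatten(order="F")
y_full = W @ x
assert np.allclose(y_fast, y_full)
```

The identity in the last step is what makes KP layers fast as well as small: two 8x8 matrix multiplies replace one 64x64 matrix-vector product, which is consistent with the abstract's claim of improved inference run-time alongside compression.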


Related research

06/07/2019 - Compressing RNNs for IoT devices by 15-38x using Kronecker Products
Recurrent Neural Networks (RNN) can be large and compute-intensive, maki...

01/08/2019 - FastGRNN: A Fast, Accurate, Stable and Tiny Kilobyte Sized Gated Recurrent Neural Network
This paper develops the FastRNN and FastGRNN algorithms to address the t...

06/12/2019 - Run-Time Efficient RNN Compression for Inference on Edge Devices
Recurrent neural networks can be large and compute-intensive, yet many a...

09/22/2017 - BreathRNNet: Breathing Based Authentication on Resource-Constrained IoT Devices using RNNs
Recurrent neural networks (RNNs) have shown promising results in audio a...

06/04/2018 - Dynamically Hierarchy Revolution: DirNet for Compressing Recurrent Neural Network on Mobile Devices
Recurrent neural networks (RNNs) achieve cutting-edge performance on a v...

10/02/2019 - AntMan: Sparse Low-Rank Compression to Accelerate RNN inference
Wide adoption of complex RNN based models is hindered by their inference...

12/02/2016 - Parameter Compression of Recurrent Neural Networks and Degradation of Short-term Memory
The significant computational costs of deploying neural networks in larg...
