Compressing RNNs for IoT devices by 15-38x using Kronecker Products

06/07/2019
by Urmish Thakker, et al.

Recurrent neural networks (RNNs) can be large and compute-intensive, making them hard to deploy on resource-constrained devices. There is therefore a need for compression techniques that can significantly shrink RNNs without degrading task accuracy. This paper introduces a method to compress RNNs for resource-constrained environments using Kronecker products. We call RNNs compressed in this way Kronecker product recurrent neural networks (KPRNNs). KPRNNs can compress LSTM [22], GRU [9], and parameter-optimized FastRNN [30] layers by 15-38x with minor loss in accuracy, and can act as drop-in replacements for most RNN cells in existing applications. By quantizing the Kronecker-compressed networks to 8 bits, we push the compression factor further, to 50x. We compare the accuracy and runtime of KPRNNs with those of other state-of-the-art compression techniques across 5 benchmarks spanning 3 applications, demonstrating the method's generality. Additionally, we show how to control the compression factor achieved by Kronecker products using a novel hybrid decomposition technique. We call RNN cells compressed using Kronecker products with this control mechanism hybrid Kronecker product RNNs (HKPRNNs). Using HKPRNNs, we compress the RNN cells in 2 benchmarks by 10x and 20x, achieving better accuracy than other state-of-the-art compression techniques.
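To illustrate the core idea, the sketch below replaces a large weight matrix W with the Kronecker product of two small factors and computes the matrix-vector product without ever materializing W, using the standard identity (A kron B) vec(X) = vec(B X A^T). This is a minimal sketch under our own assumptions: the function names (kp_matvec, hybrid_matvec), the factor shapes, and the row-wise hybrid split are illustrative, not the paper's exact formulation.

```python
import numpy as np

def kp_matvec(A, B, x):
    """Compute kron(A, B) @ x without forming the large matrix.

    Uses (A kron B) vec(X) = vec(B @ X @ A.T), where vec() stacks
    columns. A is (m1, n1), B is (m2, n2), x has length n1 * n2.
    """
    m1, n1 = A.shape
    m2, n2 = B.shape
    X = x.reshape(n1, n2).T          # column-major unvec of x: (n2, n1)
    Y = B @ X @ A.T                  # (m2, m1)
    return Y.T.reshape(m1 * m2)      # column-major vec of Y

# Sanity check against the explicit Kronecker product.
rng = np.random.default_rng(0)
A = rng.standard_normal((16, 16))
B = rng.standard_normal((16, 16))
x = rng.standard_normal(256)
assert np.allclose(kp_matvec(A, B, x), np.kron(A, B) @ x)

# Storage: 2 * 16 * 16 = 512 parameters instead of 256 * 256 = 65,536
# for the dense matrix (128x fewer for this one matrix; whole-cell
# factors such as the paper's 15-38x are lower, since not every
# parameter in an RNN cell is factored).

def hybrid_matvec(W_dense, A, B, x):
    """Illustrative hybrid split: some output rows come from a small
    dense block, the rest from a Kronecker-factored block, so the
    overall compression factor can be dialed between 1x and the full
    Kronecker factor."""
    return np.concatenate([W_dense @ x, kp_matvec(A, B, x)])
```

The hybrid split is one plausible way to realize the compression-factor control the abstract describes: keeping a dense block alongside the Kronecker-factored block trades some compression for accuracy, which is consistent with the reported 10x and 20x HKPRNN operating points.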


Related research

- Pushing the limits of RNN Compression (10/04/2019): Recurrent Neural Networks (RNN) can be difficult to deploy on resource c...
- Towards deep neural network compression via learnable wavelet transforms (04/20/2020): Wavelets are well known for data compression, yet have rarely been appli...
- BreathRNNet: Breathing Based Authentication on Resource-Constrained IoT Devices using RNNs (09/22/2017): Recurrent neural networks (RNNs) have shown promising results in audio a...
- DiffRNN: Differential Verification of Recurrent Neural Networks (07/20/2020): Recurrent neural networks (RNNs) such as Long Short Term Memory (LSTM) n...
- Run-Time Efficient RNN Compression for Inference on Edge Devices (06/12/2019): Recurrent neural networks can be large and compute-intensive, yet many a...
- Dynamically Hierarchy Revolution: DirNet for Compressing Recurrent Neural Network on Mobile Devices (06/04/2018): Recurrent neural networks (RNNs) achieve cutting-edge performance on a v...
- Large-Scale Inventory Optimization: A Recurrent-Neural-Networks-Inspired Simulation Approach (01/15/2022): Many large-scale production networks include thousands of types of final pr...
