Grow and Prune Compact, Fast, and Accurate LSTMs

05/30/2018
by   Xiaoliang Dai, et al.

Long short-term memory (LSTM) has been widely used for sequential data modeling. Researchers have increased LSTM depth by stacking LSTM cells to improve performance. However, this incurs model redundancy, increases run-time delay, and makes the LSTMs more prone to overfitting. To address these problems, we propose a hidden-layer LSTM (H-LSTM) that adds hidden layers to the LSTM's original one-level non-linear control gates. The H-LSTM increases accuracy while employing fewer external stacked layers, thus significantly reducing both the number of parameters and run-time latency. We employ grow-and-prune (GP) training to iteratively adjust the hidden layers through gradient-based growth and magnitude-based pruning of connections. This learns both the weights and the compact architecture of the H-LSTM control gates. We have GP-trained H-LSTMs for image captioning and speech recognition applications. For the NeuralTalk architecture on the MSCOCO dataset, our three models reduce the number of parameters by 38.7x [floating-point operations (FLOPs) by 45.5x] and run-time latency by 4.5x, while improving the CIDEr score by 2.6. For the DeepSpeech2 architecture on the AN4 dataset, our two models reduce the number of parameters by 19.4x (FLOPs by 23.5x), reduce run-time latency by 15.7%, and lower the word error rate from its 12.9% baseline. Thus, GP-trained H-LSTMs are compact, fast, and accurate.
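The two core ideas of the abstract can be illustrated in a minimal numpy sketch: deepening a gate itself (rather than stacking external LSTM layers) and a magnitude-based pruning step of the kind GP training uses. All layer sizes, the ReLU hidden activation, and the pruning fraction below are illustrative assumptions, not the paper's exact choices.

```python
import numpy as np


def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))


def lstm_gate(x, h, W, b):
    """Standard LSTM control gate: a single linear level followed by a sigmoid."""
    return sigmoid(W @ np.concatenate([x, h]) + b)


def h_lstm_gate(x, h, W_hidden, b_hidden, W_out, b_out):
    """H-LSTM-style gate: a hidden layer is inserted before the final sigmoid,
    deepening the gate itself instead of stacking more external LSTM layers.
    ReLU as the hidden activation is an assumption for this sketch."""
    z = np.concatenate([x, h])
    hidden = np.maximum(0.0, W_hidden @ z + b_hidden)  # extra non-linear level
    return sigmoid(W_out @ hidden + b_out)


def magnitude_prune(W, sparsity):
    """Magnitude-based pruning step of GP training: zero out (at least) the
    smallest-magnitude fraction `sparsity` of connections in W."""
    k = int(sparsity * W.size)
    if k == 0:
        return W.copy()
    # Threshold at the k-th smallest absolute weight.
    thresh = np.partition(np.abs(W).ravel(), k - 1)[k - 1]
    pruned = W.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned
```

In full GP training these pruning passes alternate with gradient-based growth, which reactivates connections whose gradients indicate they would most reduce the loss; the sketch above shows only the pruning half.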


