
Spectral Pruning for Recurrent Neural Networks

05/23/2021
by Takashi Furuya, et al.

Pruning techniques for neural networks with a recurrent architecture, such as recurrent neural networks (RNNs), are strongly desired for deployment on edge-computing devices. However, recurrent architectures are generally not robust to pruning: even slight pruning introduces errors that accumulate through the recurrence, so the total error grows significantly over time. In this paper, we propose a pruning algorithm suited to RNNs, inspired by "spectral pruning", and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and demonstrate the effectiveness of our pruning method compared with existing methods.
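Spectral pruning compresses a layer by keeping the subset of hidden units whose empirical covariance best explains the full hidden state, and reconstructing the discarded units linearly from the kept ones. Below is a minimal NumPy sketch of that idea applied to a vanilla RNN layer h_t = tanh(W_rec h_{t-1} + W_in x_t + b); the function name, the greedy unit selection, and the ridge term eps are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def spectral_prune_rnn(H, W_rec, W_in, b, k, eps=1e-6):
    """Sketch of spectral pruning for a vanilla RNN layer (hypothetical helper).

    H     : (T, d) hidden states collected by running the trained RNN on data.
    W_rec : (d, d) recurrent weights; W_in : (d, n) input weights; b : (d,) bias.
    k     : number of hidden units to keep (k <= d).
    """
    d = H.shape[1]
    Sigma = H.T @ H / H.shape[0]  # empirical covariance of the hidden states

    # Greedily select the index set J that maximizes the covariance "energy"
    # explained by the kept units: tr(Sigma_FJ Sigma_JJ^{-1} Sigma_JF).
    J, remaining = [], list(range(d))
    for _ in range(k):
        best, best_score = None, -np.inf
        for j in remaining:
            idx = J + [j]
            S_JJ = Sigma[np.ix_(idx, idx)] + eps * np.eye(len(idx))
            S_FJ = Sigma[:, idx]
            score = np.trace(S_FJ @ np.linalg.solve(S_JJ, S_FJ.T))
            if score > best_score:
                best, best_score = j, score
        J.append(best)
        remaining.remove(best)
    J = np.array(J)

    # Reconstruction matrix A linearly maps the k kept units back to all d units.
    S_JJ = Sigma[np.ix_(J, J)] + eps * np.eye(k)
    A = Sigma[:, J] @ np.linalg.inv(S_JJ)  # (d, k)

    # Compressed parameters: the new hidden state lives in R^k.
    W_rec_c = W_rec[J, :] @ A  # (k, k)
    W_in_c = W_in[J, :]        # (k, n)
    b_c = b[J]
    return J, A, W_rec_c, W_in_c, b_c
```

Under these assumptions the compressed layer runs h'_t = tanh(W_rec_c h'_{t-1} + W_in_c x_t + b_c), and the full hidden state can be approximated as A h'_t whenever a downstream layer expects the original width.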


Related Research

11/30/2019
One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation
Recent advances in the sparse neural network literature have made it pos...

06/30/2020
Understanding Diversity based Pruning of Neural Networks – Statistical Mechanical Analysis
Deep learning architectures with a huge number of parameters are often c...

06/17/2019
Structured Pruning of Recurrent Neural Networks through Neuron Selection
Recurrent neural networks (RNNs) have recently achieved remarkable succe...

05/11/2020
CSB-RNN: A Faster-than-Realtime RNN Acceleration Framework with Compressed Structured Blocks
Recurrent neural networks (RNNs) have been widely adopted in temporal se...

08/06/2021
Path classification by stochastic linear recurrent neural networks
We investigate the functioning of a classifying biological neural networ...

02/06/2016
Strongly-Typed Recurrent Neural Networks
Recurrent neural networks are increasingly popular models for sequential l...

11/30/2020
Robust error bounds for quantised and pruned neural networks
With the rise of smartphones and the internet-of-things, data is increas...