
Spectral Pruning for Recurrent Neural Networks

by Takashi Furuya, et al.

Pruning techniques for neural networks with a recurrent architecture, such as the recurrent neural network (RNN), are strongly desired for deployment on edge-computing devices. However, recurrent architectures are generally not robust to pruning: even a small amount of pruning introduces errors that accumulate through the recurrence, so the total error grows significantly over time. In this paper, we propose a pruning algorithm suited to RNNs, inspired by "spectral pruning", and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and show the effectiveness of our pruning method compared with existing methods.
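To make the idea of spectral compression concrete, the sketch below shows the simplest spectral (low-rank) truncation of a recurrent weight matrix via SVD. This is an illustrative example of the general principle only, not the algorithm proposed in the paper: the function name `spectral_compress`, the matrix size, and the rank are all assumptions for the sake of the demo. By the Eckart–Young theorem, the spectral-norm error of the rank-r truncation equals the (r+1)-th singular value, which is the kind of quantity that spectral compression bounds are built from.

```python
import numpy as np

def spectral_compress(W, rank):
    """Low-rank compression of a weight matrix via truncated SVD.

    Illustrative sketch of spectral (low-rank) pruning; not the
    exact algorithm from the paper.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    # Keep only the top `rank` singular directions.
    W_low = U[:, :rank] @ np.diag(s[:rank]) @ Vt[:rank, :]
    return W_low, s

rng = np.random.default_rng(0)
# A toy recurrent weight matrix, scaled so its spectrum is O(1).
W = rng.standard_normal((32, 32)) / np.sqrt(32)
W_low, s = spectral_compress(W, rank=8)

# Eckart-Young: the spectral-norm truncation error is the 9th singular value.
err = np.linalg.norm(W - W_low, ord=2)
print(bool(np.isclose(err, s[8])))
```

In an RNN, replacing the recurrent matrix with such a truncation shrinks the per-step approximation error, but that error is re-amplified at every time step, which is why the paper's error bounds must track accumulation over time.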




One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation

Recent advances in the sparse neural network literature have made it pos...

Understanding Diversity based Pruning of Neural Networks – Statistical Mechanical Analysis

Deep learning architectures with a huge number of parameters are often c...

Structured Pruning of Recurrent Neural Networks through Neuron Selection

Recurrent neural networks (RNNs) have recently achieved remarkable succe...

CSB-RNN: A Faster-than-Realtime RNN Acceleration Framework with Compressed Structured Blocks

Recurrent neural networks (RNNs) have been widely adopted in temporal se...

Path classification by stochastic linear recurrent neural networks

We investigate the functioning of a classifying biological neural networ...

Strongly-Typed Recurrent Neural Networks

Recurrent neural networks are increasingly popular models for sequential l...

Robust error bounds for quantised and pruned neural networks

With the rise of smartphones and the internet-of-things, data is increas...