Spectral Pruning for Recurrent Neural Networks

05/23/2021
by Takashi Furuya, et al.

Pruning techniques for neural networks with a recurrent architecture, such as the recurrent neural network (RNN), are strongly desired for deployment on edge-computing devices. However, recurrent architectures are generally not robust to pruning: even slight pruning introduces errors that accumulate through the recurrence, so the total error grows significantly over time. In this paper, we propose a pruning algorithm suited to RNNs, inspired by "spectral pruning", and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that support our theoretical results and demonstrate the effectiveness of our pruning method compared with existing methods.
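As a rough illustration of the idea behind spectral compression, the sketch below applies an SVD-based low-rank truncation to the recurrent weight matrix of a vanilla RNN (h_t = tanh(W_hh h_{t-1} + W_xh x_t)). This is not the authors' algorithm, only a generic spectral truncation under assumptions of my own: the function `low_rank_truncate`, the hidden size, and the target rank are all illustrative.

```python
import numpy as np

def low_rank_truncate(W, rank):
    """Keep only the top-`rank` singular components of W.

    A generic spectral truncation: directions with small singular
    values contribute least to the linear map h -> W @ h, so they
    are the natural candidates to discard.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

# Illustrative setup: recurrent weights of a 128-unit vanilla RNN.
rng = np.random.default_rng(0)
hidden = 128
W_hh = rng.standard_normal((hidden, hidden)) / np.sqrt(hidden)

# Compress to rank 32 and measure the relative spectral-norm error,
# i.e. the worst-case per-step amplification of the pruning error
# that the recurrence will then accumulate over time.
W_hh_pruned = low_rank_truncate(W_hh, rank=32)
err = np.linalg.norm(W_hh - W_hh_pruned, 2) / np.linalg.norm(W_hh, 2)
print(f"relative spectral error at rank 32: {err:.3f}")
```

The spectral norm is the relevant measure here because, as the abstract notes, a per-step pruning error is re-amplified at every time step, so a bound on the compressed map's deviation is what controls the accumulated error.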


Related research

11/30/2019 · One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation
Recent advances in the sparse neural network literature have made it pos...

06/30/2020 · Understanding Diversity based Pruning of Neural Networks – Statistical Mechanical Analysis
Deep learning architectures with a huge number of parameters are often c...

06/17/2019 · Structured Pruning of Recurrent Neural Networks through Neuron Selection
Recurrent neural networks (RNNs) have recently achieved remarkable succe...

05/11/2020 · CSB-RNN: A Faster-than-Realtime RNN Acceleration Framework with Compressed Structured Blocks
Recurrent neural networks (RNNs) have been widely adopted in temporal se...

08/06/2021 · Path classification by stochastic linear recurrent neural networks
We investigate the functioning of a classifying biological neural networ...

02/06/2016 · Strongly-Typed Recurrent Neural Networks
Recurrent neural networks are increasingly popular models for sequential l...

07/23/2022 · A Taxonomy of Recurrent Learning Rules
Backpropagation through time (BPTT) is the de facto standard for trainin...
