One-Shot Pruning of Recurrent Neural Networks by Jacobian Spectrum Evaluation

11/30/2019
by Matthew Shunshi Zhang, et al.

Recent advances in the sparse neural network literature have made it possible to prune many large feedforward and convolutional networks with only a small quantity of data. Yet these same techniques often falter when applied to the problem of recovering sparse recurrent networks. These failures are quantitative: when pruned with recent techniques, RNNs typically obtain worse performance than they do under a simple random pruning scheme. The failures are also qualitative: the distribution of active weights in a pruned LSTM or GRU network tends to be concentrated in specific neurons and gates, rather than well dispersed across the entire architecture. We seek to rectify both the quantitative and qualitative issues with recurrent network pruning by introducing a new recurrent pruning objective derived from the spectrum of the recurrent Jacobian. Our objective is data efficient (requiring only 64 data points to prune the network), easy to implement, and produces 95% sparse networks that significantly improve on existing baselines. We evaluate on sequential MNIST, Billion Words, and WikiText.
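The abstract only states the idea at a high level. As a rough sketch of what a spectrum-based, one-shot pruning criterion could look like (not the authors' released implementation), the PyTorch snippet below scores each recurrent weight matrix entry by the sensitivity of the singular-value sum of the recurrent Jacobian, computed on a 64-point probe batch, and then zeros the lowest-scoring weights in a single pass. The choice of GRUCell, the singular-value-sum objective, the |w · ∂R/∂w| saliency, and all function names are illustrative assumptions.

```python
import torch
import torch.nn as nn

def recurrent_jacobian(cell, x_t, h_prev):
    # Jacobian of the next hidden state w.r.t. the previous hidden state,
    # kept differentiable (create_graph=True) so we can later differentiate
    # a spectral objective w.r.t. the cell's weights.
    def step(h):
        return cell(x_t, h)
    return torch.autograd.functional.jacobian(step, h_prev, create_graph=True)

def spectral_objective(cell, xs, hs):
    # Average sum of singular values of the recurrent Jacobian over a small
    # probe batch (the abstract reports needing only 64 data points).
    total = 0.0
    for x_t, h in zip(xs, hs):
        J = recurrent_jacobian(cell, x_t.unsqueeze(0), h.unsqueeze(0))
        J = J.reshape(h.numel(), h.numel())
        total = total + torch.linalg.svdvals(J).sum()
    return total / len(xs)

def one_shot_prune(cell, xs, hs, sparsity=0.95):
    # Score each weight by |w * dR/dw|, where R is the spectral objective,
    # then zero out the lowest-scoring fraction of weights in one shot.
    R = spectral_objective(cell, xs, hs)
    weights = [p for p in cell.parameters() if p.dim() > 1]  # weight matrices only
    grads = torch.autograd.grad(R, weights)
    scores = torch.cat([(p * g).abs().flatten() for p, g in zip(weights, grads)])
    k = max(1, int(sparsity * scores.numel()))
    threshold = torch.kthvalue(scores, k).values
    with torch.no_grad():
        for p, g in zip(weights, grads):
            p.mul_(((p * g).abs() > threshold).float())

# Usage on a toy GRU cell with a 64-point probe batch (hypothetical sizes).
cell = nn.GRUCell(input_size=28, hidden_size=64)
xs = torch.randn(64, 28)
hs = torch.zeros(64, 64)
one_shot_prune(cell, xs, hs, sparsity=0.95)
```

The one-shot gradient-times-weight saliency here mirrors SNIP-style pruning, with the training loss replaced by a spectral quantity of the recurrent Jacobian; the paper's actual objective may differ in how the spectrum is aggregated and how scores are assigned to individual weights.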

