Interspace Pruning: Using Adaptive Filter Representations to Improve Training of Sparse CNNs

03/15/2022
by Paul Wimmer et al.

Unstructured pruning is well suited to reduce the memory footprint of convolutional neural networks (CNNs), both at training and inference time. CNNs contain parameters arranged in K × K filters. Standard unstructured pruning (SP) reduces the memory footprint of CNNs by setting filter elements to zero, thereby constraining each filter to a fixed subspace. Especially if pruning is applied before or during training, this induces a strong bias. To overcome this, we introduce interspace pruning (IP), a general tool to improve existing pruning methods. It represents filters in a dynamic interspace as linear combinations of an underlying adaptive filter basis (FB). For IP, FB coefficients are set to zero while un-pruned coefficients and FBs are trained jointly. In this work, we provide mathematical evidence for IP's superior performance and demonstrate that IP outperforms SP on all tested state-of-the-art unstructured pruning methods. Especially in challenging situations, such as pruning for ImageNet or pruning to high sparsity, IP greatly outperforms SP at equal runtime and parameter cost. Finally, we show that IP's gains stem from improved trainability and superior generalization ability.
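To make the mechanism concrete, the following is a minimal PyTorch sketch of an interspace convolution layer, assuming one shared, trainable basis of K × K filters per layer. The class name InterspaceConv2d, the identity initialization of the basis, and the magnitude criterion in prune() are illustrative assumptions, not the authors' reference implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class InterspaceConv2d(nn.Module):
    """Conv2d whose K x K filters are linear combinations of a shared,
    trainable filter basis (FB); pruning zeroes FB coefficients."""

    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.k = k
        n = k * k  # basis size spanning the full K x K filter space
        # Adaptive FB, initialized to the standard (one-hot) basis so the
        # layer starts out equivalent to an ordinary convolution.
        self.basis = nn.Parameter(torch.eye(n).reshape(n, k, k))
        # One coefficient vector of length n per (out_ch, in_ch) filter.
        self.coeff = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, n))
        # Fixed binary pruning mask on the coefficients (not trained).
        self.register_buffer("mask", torch.ones_like(self.coeff))

    def prune(self, sparsity):
        """Zero the smallest-magnitude coefficients; sparsity in [0, 1).
        Magnitude ranking here stands in for whichever unstructured
        criterion IP is combined with."""
        scores = self.coeff.detach().abs()
        thresh = scores.flatten().sort().values[int(sparsity * scores.numel())]
        self.mask.copy_((scores >= thresh).float())

    def forward(self, x):
        # Rebuild dense filters: weight[o, i] = sum_b coeff[o, i, b] * basis[b].
        w = torch.einsum("oib,bkl->oikl", self.coeff * self.mask, self.basis)
        return F.conv2d(x, w, padding=self.k // 2)

A hypothetical usage: layer = InterspaceConv2d(16, 32); layer.prune(0.9) keeps roughly 10% of the FB coefficients, after which layer.coeff and layer.basis are optimized jointly with the mask held fixed. This is the key contrast with SP, where a zeroed filter element pins the filter to a fixed, axis-aligned subspace; here the surviving coefficients combine basis filters that keep adapting during training.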

Related research

08/21/2018
Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks
This paper proposes a Soft Filter Pruning (SFP) method to accelerate the...

04/05/2023
Efficient CNNs via Passive Filter Pruning
Convolutional neural networks (CNNs) have shown state-of-the-art perform...

03/01/2023
Structured Pruning for Deep Convolutional Neural Networks: A Survey
The remarkable performance of deep convolutional neural networks (CNNs) ...

05/06/2020
Dependency Aware Filter Pruning
Convolutional neural networks (CNNs) are typically over-parameterized, b...

05/05/2023
Compressing audio CNNs with graph centrality based filter pruning
Convolutional neural networks (CNNs) are commonplace in high-performing ...

05/13/2019
Implicit Filter Sparsification In Convolutional Neural Networks
We show implicit filter level sparsity manifests in convolutional neural...

11/18/2018
RePr: Improved Training of Convolutional Filters
A well-trained convolutional neural network can easily be pruned without...
