Spatial-Winograd Pruning Enabling Sparse Winograd Convolution

01/08/2019
by Jiecao Yu, et al.

Deep convolutional neural networks (CNNs) are deployed in a wide range of applications but impose immense computational demands. Pruning and Winograd convolution are two typical methods for reducing CNN computation. However, they cannot be directly combined because the Winograd transformation fills in the sparsity produced by pruning. Li et al. (2017) propose sparse Winograd convolution, in which weights are pruned directly in the Winograd domain, but this technique is not very practical because Winograd-domain retraining requires low learning rates and hence significantly longer training time. In addition, Liu et al. (2018) move the ReLU function into the Winograd domain, which helps increase weight sparsity but requires changes to the network structure. To achieve high Winograd-domain weight sparsity without changing the network structure, we propose a new pruning method, spatial-Winograd pruning. In the first step, spatial-domain weights are pruned in a structured way, which efficiently transfers the spatial-domain sparsity into the Winograd domain and avoids Winograd-domain retraining. In the second step, we perform pruning and retraining directly in the Winograd domain, using an importance factor matrix to adjust weight importance and weight gradients. This adjustment makes it possible to effectively retrain the pruned Winograd-domain network without changing the network structure. For the three models on CIFAR-10, CIFAR-100, and ImageNet, our proposed method achieves Winograd-domain sparsities of 63% …
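
To see why the transformation fills in spatial sparsity, consider the Winograd weight transform U = G W G^T. Below is a minimal NumPy sketch (not the authors' code) using the standard F(2x2, 3x3) transform matrix G from Lavin and Gray; the paper may use a different tile size, but the effect is the same: a heavily pruned 3x3 kernel becomes an almost fully dense 4x4 Winograd-domain kernel.

```python
import numpy as np

# Winograd weight-transform matrix G for F(2x2, 3x3).
G = np.array([
    [1.0,  0.0, 0.0],
    [0.5,  0.5, 0.5],
    [0.5, -0.5, 0.5],
    [0.0,  0.0, 1.0],
])

# A 3x3 spatial-domain kernel pruned down to 3 non-zero weights.
w = np.array([
    [0.0, 0.9,  0.0],
    [0.0, 0.0, -0.7],
    [0.6, 0.0,  0.0],
])

# Transform into the Winograd domain: U = G W G^T, a 4x4 weight tile.
u = G @ w @ G.T

print(np.count_nonzero(w), "of 9 spatial-domain weights are non-zero")    # 3
print(np.count_nonzero(u), "of 16 Winograd-domain weights are non-zero")  # 13
```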

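The second step prunes and retrains directly in the Winograd domain, reweighting both the pruning criterion and the weight gradients with an importance factor matrix. The sketch below only illustrates the idea: the construction of the factor matrix S (an outer product of the row norms of the F(2x2, 3x3) output-transform matrix A) and the exact way it enters the pruning score and the update are assumptions made for illustration, not the paper's definitions.

```python
import numpy as np

# Output-transform matrix A for F(2x2, 3x3): the output tile is A^T (U * V) A.
A = np.array([
    [1.0,  0.0],
    [1.0,  1.0],
    [1.0, -1.0],
    [0.0, -1.0],
])

# Hypothetical importance factors: how strongly each of the 4x4 Winograd-domain
# positions can influence the 2x2 output tile (an assumption, not the paper's formula).
row_norms = np.linalg.norm(A, axis=1)   # shape (4,)
S = np.outer(row_norms, row_norms)      # shape (4, 4)

def prune_winograd(u, sparsity):
    """Zero out the Winograd-domain weights with the smallest adjusted importance."""
    score = np.abs(u) * S
    threshold = np.quantile(score, sparsity)
    mask = score > threshold
    return u * mask, mask

def retrain_step(u, grad, mask, lr=1e-3):
    """One gradient step in the Winograd domain: scale the gradient by the
    importance factors and keep pruned positions at zero."""
    return (u - lr * grad * S) * mask

# Example on a random 4x4 Winograd-domain kernel tile.
rng = np.random.default_rng(0)
u = rng.standard_normal((4, 4))
u_pruned, mask = prune_winograd(u, sparsity=0.5)
grad = rng.standard_normal((4, 4))      # gradient w.r.t. the Winograd-domain weights
u_updated = retrain_step(u_pruned, grad, mask)
print("kept", int(mask.sum()), "of 16 Winograd-domain weights")
```

The intuition behind the assumed factors is that Winograd-domain positions which contribute more strongly to the output tile through A^T (U * V) A should be harder to prune and should have their gradients scaled accordingly.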

