Training Efficient Network Architecture and Weights via Direct Sparsity Control

02/11/2020
by Yangzi Guo, et al.

Artificial neural networks (ANNs), especially deep convolutional networks, are very popular and have proved capable of offering reliable solutions to many vision problems. However, the use of deep neural networks is widely impeded by their intensive computational and memory cost. In this paper, we propose a novel, efficient network pruning method that is suitable for both non-structured and structured channel-level pruning. Our method tightens a sparsity constraint by gradually removing network parameters or filter channels according to a criterion and a schedule. Because the network size keeps dropping throughout the iterations, the method is suitable for pruning any untrained or pre-trained network. Because it uses an L0 constraint instead of an L1 penalty, it introduces no bias into the trained parameters or filter channels. Furthermore, the L0 constraint makes it easy to directly specify the desired sparsity level during the pruning process. Finally, experiments on both synthetic and real datasets show that the proposed method obtains better or competitive performance compared to other state-of-the-art network pruning methods.
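To make the idea concrete, here is a minimal sketch of direct sparsity control on a plain weight vector: take a gradient step, then project onto the L0 ball by keeping only the k largest-magnitude weights, with k annealed from the full size down to the target sparsity level over the iterations. The magnitude criterion, the linear keep-schedule, and the names (prune_with_schedule, grad_fn) are illustrative assumptions for this sketch, not necessarily the paper's exact choices.

```python
import numpy as np

def prune_with_schedule(w, grad_fn, total_iters, target_k, lr=0.1):
    """Iteratively train and prune w under a hard L0 constraint.

    At iteration t, only k(t) weights are kept, where k(t) decays
    linearly from len(w) down to target_k; pruned weights are held
    at zero. Since k(t) is non-increasing, a removed weight never
    re-enters, so the model size drops monotonically.
    """
    n = w.size
    mask = np.ones(n, dtype=bool)
    for t in range(1, total_iters + 1):
        g = grad_fn(w)
        w[mask] -= lr * g[mask]                   # update surviving weights only
        k = int(target_k + (n - target_k) * (1 - t / total_iters))
        keep = np.argsort(np.abs(w))[-k:]         # criterion: largest magnitudes
        mask[:] = False
        mask[keep] = True
        w[~mask] = 0.0                            # enforce ||w||_0 <= k exactly
    return w, mask

# Toy usage: recover a 5-sparse linear model from 100 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 100))
w_true = np.zeros(100)
w_true[:5] = rng.standard_normal(5) + 2.0
y = X @ w_true
grad_fn = lambda w: X.T @ (X @ w - y) / len(y)
w_hat, mask = prune_with_schedule(rng.standard_normal(100) * 0.01,
                                  grad_fn, total_iters=200, target_k=5)
```

Note how the desired sparsity level is specified directly through target_k, rather than indirectly through a penalty coefficient as in L1-based methods; the same train-then-project loop applies to filter channels by scoring and masking whole channels instead of individual weights.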

