DSA: More Efficient Budgeted Pruning via Differentiable Sparsity Allocation

04/05/2020
by Xuefei Ning, et al.

Budgeted pruning is the problem of pruning under resource constraints. In budgeted pruning, the key question is how to distribute the resources across layers (i.e., sparsity allocation). Traditional methods solve it by discretely searching for the layer-wise pruning ratios, which is inefficient. In this paper, we propose Differentiable Sparsity Allocation (DSA), an efficient end-to-end budgeted pruning flow. Utilizing a novel differentiable pruning process, DSA finds the layer-wise pruning ratios with gradient-based optimization. It allocates sparsity in a continuous space, which is more efficient than methods based on discrete evaluation and search. Furthermore, DSA can work in a pruning-from-scratch manner, whereas traditional budgeted pruning methods must be applied to pre-trained models. Experimental results on CIFAR-10 and ImageNet show that DSA achieves superior performance compared to current iterative budgeted pruning methods, while shortening the time cost of the overall pruning process by at least 1.5x.
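To make the idea concrete, the following is a minimal PyTorch sketch of gradient-based sparsity allocation under a FLOPs budget. It is an illustrative assumption, not the paper's actual formulation: the names (GatedConv, TinyNet, expected_flops_ratio), the sigmoid channel gates, and the soft budget penalty are all hypothetical stand-ins for DSA's differentiable pruning process and constraint handling. The point it demonstrates is the abstract's core claim: each layer learns soft channel gates, a differentiable cost model estimates the resource usage, and a penalty on budget violations steers the layer-wise keep ratios through ordinary backpropagation instead of a discrete search over pruning ratios.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GatedConv(nn.Module):
    """A 3x3 conv whose output channels are scaled by differentiable gates (hypothetical sketch)."""
    def __init__(self, in_ch, out_ch, temperature=0.5):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, 3, padding=1)
        # Start near 1.0 (dense) so pruning emerges during training.
        self.gate_logits = nn.Parameter(torch.full((out_ch,), 2.0))
        self.temperature = temperature

    def gates(self):
        # Soft, differentiable surrogate for a binary keep/prune mask.
        return torch.sigmoid(self.gate_logits / self.temperature)

    def keep_ratio(self):
        return self.gates().mean()

    def forward(self, x):
        return self.conv(x) * self.gates().view(1, -1, 1, 1)

class TinyNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.layers = nn.ModuleList(
            [GatedConv(3, 32), GatedConv(32, 64), GatedConv(64, 128)])
        self.head = nn.Linear(128, num_classes)

    def forward(self, x):
        for layer in self.layers:
            x = F.max_pool2d(F.relu(layer(x)), 2)
        return self.head(x.mean(dim=(2, 3)))  # global average pooling

def expected_flops_ratio(net):
    # Rough differentiable cost model: a conv's FLOPs scale with the keep
    # ratios of its input and output channels (per-layer sizes ignored).
    ratios, prev = [], torch.tensor(1.0)
    for layer in net.layers:
        k = layer.keep_ratio()
        ratios.append(prev * k)
        prev = k
    return torch.stack(ratios).mean()  # 1.0 corresponds to the dense net

net = TinyNet()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
budget = 0.5  # target: roughly half of the dense FLOPs

for step in range(100):  # random tensors stand in for real training data
    x, y = torch.randn(8, 3, 32, 32), torch.randint(0, 10, (8,))
    task_loss = F.cross_entropy(net(x), y)
    # Penalize only budget violations; the gradient flows into the gate
    # logits of every layer, allocating sparsity in continuous space.
    budget_loss = F.relu(expected_flops_ratio(net) - budget) ** 2
    (task_loss + 10.0 * budget_loss).backward()
    opt.step()
    opt.zero_grad()

# Afterwards, channels whose gate falls below 0.5 would be pruned away,
# and the per-layer keep ratios give the sparsity allocation.

In a sketch like this, annealing the gate temperature toward zero over training would push the soft masks toward hard 0/1 decisions, shrinking the gap between the relaxed model and the finally pruned network.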

Related research

- 09/27/2019: Global Sparse Momentum SGD for Pruning Very Deep Neural Networks
  Deep Neural Network (DNN) is powerful but computationally expensive and ...
- 07/17/2023: Differentiable Transportation Pruning
  Deep learning algorithms are increasingly employed at the edge. However,...
- 03/30/2020: DHP: Differentiable Meta Pruning via HyperNetworks
  Network pruning has been the driving force for the efficient inference o...
- 02/28/2020: Learned Threshold Pruning
  This paper presents a novel differentiable method for unstructured weigh...
- 11/10/2018: Using NonBacktracking Expansion to Analyze k-core Pruning Process
  We induce the NonBacktracking Expansion Branch method to analyze the k-c...
- 03/09/2022: CP-ViT: Cascade Vision Transformer Pruning via Progressive Sparsity Prediction
  Vision transformer (ViT) has achieved competitive accuracy on a variety ...
- 07/30/2020: Growing Efficient Deep Networks by Structured Continuous Sparsification
  We develop an approach to training deep networks while dynamically adjus...
