DAIS: Automatic Channel Pruning via Differentiable Annealing Indicator Search

11/04/2020 · by Yushuo Guan, et al.

Convolutional neural networks have achieved great success on computer vision tasks, but their large computation overhead hinders efficient deployment. Structured (channel) pruning is commonly applied to reduce model redundancy while preserving the network structure, so that the pruned network can be easily deployed in practice. However, existing structured pruning methods rely on hand-crafted rules, which can lead to a tremendously large pruning space. In this paper, we introduce Differentiable Annealing Indicator Search (DAIS), which leverages the strength of neural architecture search for channel pruning and automatically searches for an effective pruned model under given constraints on computation overhead. Specifically, DAIS relaxes the binarized channel indicators to be continuous and then jointly learns both the indicators and the model parameters via bi-level optimization. To bridge the non-negligible discrepancy between the continuous model and the target binarized model, DAIS employs an annealing-based procedure that steers the indicators toward binarized states. Moreover, DAIS designs various regularizations based on a priori structural knowledge to control the pruning sparsity and to improve model performance. Experimental results show that DAIS outperforms state-of-the-art pruning methods on CIFAR-10, CIFAR-100, and ImageNet.
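The full formulation is given in the paper; as a rough illustration of the core idea (continuous per-channel indicators, annealed toward binary values, multiplying the outputs of a convolution), here is a minimal PyTorch-style sketch. The class names, the sigmoid-with-temperature relaxation, and the decay schedule are illustrative assumptions, not the authors' implementation, and the bi-level update (indicator logits on validation data, weights on training data) is omitted for brevity.

```python
# Minimal sketch (not the authors' code): a convolution whose output channels are
# gated by continuous indicators. A sigmoid with a decreasing temperature relaxes
# the binary keep/prune decision; as the temperature anneals toward 0, the
# indicators are pushed toward {0, 1}, approximating a binarized pruning mask.
import torch
import torch.nn as nn

class AnnealedChannelIndicator(nn.Module):
    def __init__(self, num_channels: int, init_temperature: float = 1.0):
        super().__init__()
        # One learnable logit per output channel (the "architecture" parameters).
        self.logits = nn.Parameter(torch.zeros(num_channels))
        self.temperature = init_temperature

    def set_temperature(self, t: float):
        # Called by an external annealing schedule, e.g. exponential decay per epoch.
        self.temperature = t

    def forward(self) -> torch.Tensor:
        # Continuous relaxation of the binary indicator; sharper as temperature -> 0.
        return torch.sigmoid(self.logits / self.temperature)

class PrunableConv(nn.Module):
    def __init__(self, in_ch: int, out_ch: int, **conv_kwargs):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, **conv_kwargs)
        self.indicator = AnnealedChannelIndicator(out_ch)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        gate = self.indicator().view(1, -1, 1, 1)  # broadcast over N, H, W
        return self.conv(x) * gate

# Usage: anneal the temperature during training, then threshold the indicators.
layer = PrunableConv(3, 16, kernel_size=3, padding=1)
for epoch in range(10):
    layer.indicator.set_temperature(1.0 * (0.9 ** epoch))  # illustrative schedule
    out = layer(torch.randn(2, 3, 32, 32))
keep_mask = layer.indicator() > 0.5  # channels to keep after convergence
```

In practice the gating logits would be optimized against a validation loss plus a computation-overhead constraint, while the convolution weights follow the ordinary training loss; channels whose indicators converge near zero are then removed from the network.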


Related research:

10/28/2020 · Differentiable Channel Pruning Search
11/24/2019 · Exploiting Operation Importance for Differentiable Neural Architecture Search
08/19/2021 · An Information Theory-inspired Strategy for Automatic Network Pruning
07/14/2022 · PR-DARTS: Pruning-Based Differentiable Architecture Search
06/02/2022 · Pruning-as-Search: Efficient Neural Architecture Search via Channel Pruning and Structural Reparameterization
10/08/2022 · Advancing Model Pruning via Bi-level Optimization
11/12/2019 · CALPA-NET: Channel-pruning-assisted Deep Residual Network for Steganalysis of Digital Images
