CATRO: Channel Pruning via Class-Aware Trace Ratio Optimization

10/21/2021
by Wenzheng Hu, et al.

Deep convolutional neural networks often carry substantial parametric and computational redundancy in many application scenarios, and a growing body of work has explored model pruning to obtain lightweight, efficient networks. However, most existing pruning approaches are driven by empirical heuristics and rarely consider the joint impact of channels, leading to suboptimal performance with no guarantees. In this paper, we propose a novel channel pruning method via class-aware trace ratio optimization (CATRO) to reduce the computational burden and accelerate model inference. Using class information from a few samples, CATRO measures the joint impact of multiple channels by their feature-space discrimination and consolidates the layer-wise impact of the preserved channels. By formulating channel pruning as a submodular set-function maximization problem, CATRO solves it efficiently via a two-stage greedy iterative optimization procedure. More importantly, we present theoretical justifications for the convergence and performance of CATRO. Experimental results demonstrate that CATRO achieves higher accuracy at similar computational cost, or lower computational cost at similar accuracy, than other state-of-the-art channel pruning algorithms. In addition, because of its class-aware property, CATRO is well suited to pruning efficient networks adaptively for various classification subtasks, easing the deployment and use of deep networks in real-world applications.
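To make the core idea concrete, the sketch below illustrates greedy channel selection under a class-separability score in the spirit of a trace ratio: the trace of the between-class scatter divided by the trace of the within-class scatter, restricted to the selected channels. This is a minimal illustration only, not the paper's implementation; the function names (`trace_ratio`, `greedy_select`) are hypothetical, and CATRO's actual objective and its two-stage optimization procedure differ in detail.

```python
import numpy as np

def trace_ratio(features, labels, channels):
    """Class-separability score for a channel subset: trace of the
    between-class scatter over trace of the within-class scatter,
    computed on the selected feature dimensions only."""
    X = features[:, channels]
    mu = X.mean(axis=0)           # overall mean of selected channels
    sb = 0.0                      # trace of between-class scatter
    sw = 0.0                      # trace of within-class scatter
    for c in np.unique(labels):
        Xc = X[labels == c]
        mu_c = Xc.mean(axis=0)
        sb += len(Xc) * np.sum((mu_c - mu) ** 2)
        sw += np.sum((Xc - mu_c) ** 2)
    return sb / (sw + 1e-12)      # small epsilon guards division by zero

def greedy_select(features, labels, k):
    """Greedily pick k channels; each step adds the channel whose
    inclusion yields the highest trace-ratio score."""
    selected = []
    remaining = list(range(features.shape[1]))
    for _ in range(k):
        best = max(remaining,
                   key=lambda j: trace_ratio(features, labels, selected + [j]))
        selected.append(best)
        remaining.remove(best)
    return sorted(selected)
```

On synthetic two-class data where only the first two channels shift with the label, `greedy_select(features, labels, 2)` recovers those two channels; the remaining noise channels contribute almost nothing to the between-class scatter, so the greedy step never prefers them.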

Related research

08/31/2021 · Pruning with Compensation: Efficient Channel Pruning for Deep Convolutional Neural Networks
Channel pruning is a promising technique to compress the parameters of d...

10/20/2019 · Self-Adaptive Network Pruning
Deep convolutional neural networks have been proved successful on a wide...

04/06/2019 · C2S2: Cost-aware Channel Sparse Selection for Progressive Network Pruning
This paper describes a channel-selection approach for simplifying deep n...

08/06/2019 · Exploiting Channel Similarity for Accelerating Deep Convolutional Neural Networks
To address the limitations of existing magnitude-based pruning algorithm...

11/14/2022 · Pruning Very Deep Neural Network Channels for Efficient Inference
In this paper, we introduce a new channel pruning method to accelerate v...

10/28/2022 · Determining Ratio of Prunable Channels in MobileNet by Sparsity for Acoustic Scene Classification
MobileNet is widely used for Acoustic Scene Classification (ASC) in embe...

05/28/2020 · A Feature-map Discriminant Perspective for Pruning Deep Neural Networks
Network pruning has become the de facto tool to accelerate deep neural n...
