Class-Discriminative CNN Compression

10/21/2021
by Yuchen Liu, et al.

Compressing convolutional neural networks (CNNs) by pruning and distillation has received ever-increasing focus in the community. In particular, a class-discrimination-based approach is desirable, as it fits seamlessly with the CNN training objective. In this paper, we propose class-discriminative compression (CDC), which injects class discrimination into both pruning and distillation to facilitate the CNN training goal. We first study the effectiveness of a group of discriminant functions for channel pruning, where we include well-known single-variate binary-class statistics such as Student's T-Test via an intuitive generalization. We then propose a novel layer-adaptive hierarchical pruning approach, which uses a coarse class-discrimination scheme for early layers and a fine one for later layers. This design naturally accords with the fact that CNNs process coarse semantics in early layers and extract fine concepts in later ones. Moreover, we leverage discriminant component analysis (DCA) to distill knowledge of intermediate representations in a subspace with rich discriminative information, which enhances the hidden layers' linear separability and the student's classification accuracy. Combining pruning and distillation, CDC is evaluated on CIFAR and ILSVRC 2012, where we consistently outperform state-of-the-art results.
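To illustrate the idea of scoring channels with a single-variate binary-class statistic, here is a minimal sketch of ranking channels by a Student's T-Test style discriminant on pooled activations and keeping the most discriminative ones. The function names (channel_ttest_score, prune_lowest) and the one-vs-rest averaging over classes are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch: T-Test style class-discriminative channel scoring.
import numpy as np

def channel_ttest_score(feats, labels):
    """feats: (N, C) globally pooled channel activations; labels: (N,) class ids.

    Returns a per-channel score: the absolute two-sample t-statistic of
    class-c vs. rest activations, averaged over classes (one-vs-rest).
    """
    classes = np.unique(labels)
    scores = np.zeros(feats.shape[1])
    for c in classes:
        pos = feats[labels == c]            # samples of class c
        neg = feats[labels != c]            # all other samples
        mean_diff = pos.mean(axis=0) - neg.mean(axis=0)
        var_term = pos.var(axis=0) / len(pos) + neg.var(axis=0) / len(neg)
        scores += np.abs(mean_diff) / np.sqrt(var_term + 1e-8)
    return scores / len(classes)

def prune_lowest(scores, keep_ratio=0.5):
    """Indices of channels to keep: those with the highest discriminant scores."""
    k = max(1, int(len(scores) * keep_ratio))
    return np.argsort(scores)[-k:]
```

Under the layer-adaptive hierarchical scheme described above, the labels passed to such a scoring function would be coarse class groupings for early layers and the full fine-grained labels for later layers; the DCA-based distillation component is separate and not covered by this sketch.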
