MINT: Deep Network Compression via Mutual Information-based Neuron Trimming

03/18/2020
by Madan Ravi Ganesh, et al.

Most approaches to deep neural network compression via pruning either evaluate a filter's importance using its weights or optimize an alternative objective function with sparsity constraints. While these methods offer a useful way to approximate contributions from similar filters, they often either ignore the dependency between layers or solve a more difficult optimization objective than standard cross-entropy. Our method, Mutual Information-based Neuron Trimming (MINT), approaches deep compression via pruning by enforcing sparsity based on the strength of the relationship between filters of adjacent layers, across every pair of layers. The relationship is calculated using conditional geometric mutual information, which evaluates the amount of similar information exchanged between the filters using a graph-based criterion. When pruning a network, we ensure that the retained filters contribute the majority of the information flowing to succeeding layers, which maintains high performance. Our novel approach outperforms existing state-of-the-art compression-via-pruning methods on the standard benchmarks for this task (MNIST, CIFAR-10, and ILSVRC2012) across a variety of network architectures. In addition, we observe a common trend in how our pruned networks respond to adversarial attacks and in their calibration statistics when compared to the original networks.
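The core idea, scoring each filter by how much information it shares with the filters of the next layer and retaining the highest-scoring ones, can be sketched in a few lines. Note that this is a simplified illustration, not the paper's implementation: it substitutes a naive histogram-based mutual information estimate for the graph-based conditional geometric MI estimator MINT actually uses, and the function names (`histogram_mi`, `importance_scores`, `prune_mask`) and the `keep_ratio` parameter are hypothetical.

```python
import numpy as np

def histogram_mi(x, y, bins=16):
    """Naive histogram estimate of mutual information between two 1-D
    activation signals. (A stand-in for the paper's graph-based
    conditional geometric MI estimator.)"""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal over x
    py = pxy.sum(axis=0, keepdims=True)   # marginal over y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def importance_scores(acts_l, acts_lp1, bins=16):
    """Score each filter of layer l by its summed MI with every filter
    of layer l+1. acts_l: (n_samples, n_filters_l) activations,
    acts_lp1: (n_samples, n_filters_{l+1}) activations."""
    n_l, n_lp1 = acts_l.shape[1], acts_lp1.shape[1]
    scores = np.zeros(n_l)
    for i in range(n_l):
        for j in range(n_lp1):
            scores[i] += histogram_mi(acts_l[:, i], acts_lp1[:, j], bins)
    return scores

def prune_mask(scores, keep_ratio=0.5):
    """Keep the top keep_ratio fraction of filters by score, so the
    retained filters carry most of the information to the next layer."""
    k = max(1, int(round(keep_ratio * len(scores))))
    keep = np.argsort(scores)[::-1][:k]
    mask = np.zeros(len(scores), dtype=bool)
    mask[keep] = True
    return mask
```

A filter whose activations strongly determine the next layer's activations accumulates a high score and survives pruning; filters that are nearly independent of the next layer score near the estimator's bias floor and are trimmed first.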


Related research

02/22/2022
HRel: Filter Pruning based on High Relevance between Activation Maps and Class Labels
This paper proposes an Information Bottleneck theory based filter prunin...

06/22/2020
Slimming Neural Networks using Adaptive Connectivity Scores
There are two broad approaches to deep neural network (DNN) pruning: 1) ...

10/06/2020
Comprehensive Online Network Pruning via Learnable Scaling Factors
One of the major challenges in deploying deep neural network architectur...

04/18/2018
Understanding Individual Neuron Importance Using Information Theory
In this work, we characterize the outputs of individual neurons in a tra...

03/11/2022
Improve Convolutional Neural Network Pruning by Maximizing Filter Variety
Neural network pruning is a widely used strategy for reducing model stor...

09/17/2023
Conditional Mutual Information Constrained Deep Learning for Classification
The concepts of conditional mutual information (CMI) and normalized cond...

03/21/2023
Performance-aware Approximation of Global Channel Pruning for Multitask CNNs
Global channel pruning (GCP) aims to remove a subset of channels (filter...
