On Causal Inference for Data-free Structured Pruning

12/19/2021
by   Martin Ferianc, et al.

Neural networks (NNs) are making a large impact on both research and industry. Nevertheless, as NNs' accuracy increases, so do their size, the number of compute operations they require, and their energy consumption. This growth in resource consumption slows NNs' adoption and makes real-world deployment impractical. NNs therefore need to be compressed to make them available to a wider audience and, at the same time, to decrease their runtime costs. In this work, we approach this challenge from a causal inference perspective and propose a scoring mechanism to facilitate structured pruning of NNs. The approach is based on measuring mutual information under a maximum entropy perturbation, sequentially propagated through the NN. We demonstrate the method's performance on two datasets and various NN sizes, and we show that our approach achieves competitive performance under challenging conditions.
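To make the idea concrete, here is a minimal toy sketch of mutual-information-based channel scoring under a maximum entropy perturbation. This is not the authors' algorithm: the network, the Gaussian correlation-based MI proxy, and all names (`mi_score`, `forward`, the pruning budget `k`) are illustrative assumptions. It only shows the general pattern the abstract describes: probe the network with a maximum entropy input (a Gaussian, which maximizes differential entropy for a fixed variance), score each hidden unit by an estimate of its mutual information with the output, and structurally prune the lowest-scoring units.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer MLP: x -> ReLU(W1 x) -> w2 . h  (scalar output).
d_in, d_hidden = 8, 6
W1 = rng.normal(size=(d_hidden, d_in))
w2 = rng.normal(size=d_hidden)

def forward(X):
    # Hidden activations (d_hidden, N) and scalar outputs (N,).
    H = np.maximum(W1 @ X.T, 0.0)
    return H, w2 @ H

# Maximum entropy perturbation: N(0, I) has the largest differential
# entropy among distributions with fixed variance, so we probe with it.
X = rng.normal(size=(4096, d_in))
H, y = forward(X)

def mi_score(h, y):
    # Gaussian MI proxy: for jointly Gaussian variables,
    # I(h; y) = -0.5 * log(1 - rho^2). Used here only as a cheap,
    # illustrative importance score (an assumption, not the paper's estimator).
    rho = np.corrcoef(h, y)[0, 1]
    rho = np.clip(rho, -0.999999, 0.999999)
    return -0.5 * np.log(1.0 - rho ** 2)

scores = np.array([mi_score(H[i], y) for i in range(d_hidden)])

# Structured pruning: drop the k hidden units with the lowest scores,
# removing whole rows/entries rather than individual weights.
k = 2
keep = np.argsort(scores)[k:]
W1_pruned, w2_pruned = W1[keep], w2[keep]
```

A data-free flavor of this sketch comes from the fact that the probe inputs are synthetic noise rather than training samples; the paper's actual method additionally propagates the perturbation sequentially through the layers.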


