Channel Gating Neural Networks

05/29/2018
by Weizhe Hua, et al.

Employing deep neural networks to obtain state-of-the-art performance on computer vision tasks can consume billions of floating-point operations and several joules of energy per evaluation. Network pruning, which statically removes unnecessary features and weights, has emerged as a promising way to reduce this computation cost. In this paper, we propose channel gating, a dynamic, fine-grained, training-based computation-cost-reduction scheme. Channel gating works by identifying the regions in the features that contribute less to the classification result and turning off a subset of the channels when computing the pixels within these uninteresting regions. Unlike static network pruning, channel gating optimizes the computation at run-time by exploiting characteristics specific to each input. We show experimentally that applying channel gating to state-of-the-art networks can achieve up to a 66% reduction in FLOPs with as little as a 0.22% accuracy loss on the CIFAR-10 and CIFAR-100 datasets.
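To make the mechanism concrete, here is a minimal PyTorch sketch of how a channel-gated convolution could be wired up. It is an illustration under our own assumptions, not the authors' reference implementation: the module name ChannelGatingConv, the base_frac split, and the fixed gating threshold are hypothetical, and the paper's scheme learns its gating behavior during training rather than using a fixed cutoff.

```python
import torch
import torch.nn as nn

class ChannelGatingConv(nn.Module):
    """Minimal sketch of a channel-gated 3x3 convolution (hypothetical module).

    A fraction `base_frac` of the input channels (the "base" group) is always
    computed. The normalized partial sum from the base group drives a binary,
    per-pixel gate that decides whether the remaining channels are evaluated
    for each spatial location.
    """

    def __init__(self, in_channels, out_channels, base_frac=0.5, threshold=0.0):
        super().__init__()
        self.base = max(1, int(in_channels * base_frac))
        # Always-on path over the base subset of input channels.
        self.conv_base = nn.Conv2d(self.base, out_channels, 3,
                                   padding=1, bias=False)
        # Conditional path over the remaining input channels.
        self.conv_rest = nn.Conv2d(in_channels - self.base, out_channels, 3,
                                   padding=1, bias=False)
        self.bn = nn.BatchNorm2d(out_channels)  # normalizes the partial sum
        self.threshold = threshold

    def forward(self, x):
        x_base, x_rest = x[:, :self.base], x[:, self.base:]
        partial = self.conv_base(x_base)
        # Gate is 1 where the normalized partial sum suggests the pixel still
        # matters for classification; 0 in "uninteresting" regions. A hard
        # threshold is not differentiable, so end-to-end training would need
        # a smooth relaxation of this step.
        gate = (self.bn(partial) > self.threshold).float()
        # A dense framework still computes conv_rest everywhere and masks it;
        # realizing the FLOP savings requires kernels or hardware that skip
        # the gated-off pixels.
        return partial + gate * self.conv_rest(x_rest)

# Example: gate a 16-channel feature map, producing 32 output channels.
layer = ChannelGatingConv(16, 32)
y = layer(torch.randn(1, 16, 32, 32))  # y has shape (1, 32, 32, 32)
```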


