The Power of Sparsity in Convolutional Neural Networks

02/21/2017
by Soravit Changpinyo, et al.

Deep convolutional networks are well-known for their high computational and memory demands. Given limited resources, how does one design a network that balances its size, training time, and prediction accuracy? A surprisingly effective approach to trade accuracy for size and speed is to simply reduce the number of channels in each convolutional layer by a fixed fraction and retrain the network. In many cases this leads to significantly smaller networks with only minimal changes to accuracy. In this paper, we take a step further by empirically examining a strategy for deactivating connections between filters in convolutional layers in a way that allows us to harvest savings in both run-time and memory for many network architectures. More specifically, we generalize 2D convolution to use a channel-wise sparse connection structure and show that this leads to significantly better results than the baseline approach for large networks including VGG and Inception V3.
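For intuition, here is a minimal PyTorch sketch of a convolution with a channel-wise sparse connection structure, where each output channel is connected to only a subset of the input channels. The `ChannelSparseConv2d` class, the random fixed mask, and the `density` parameter are illustrative assumptions for this sketch; the paper's actual connection patterns and training procedure may differ.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelSparseConv2d(nn.Module):
    """2D convolution with a channel-wise sparse connection structure:
    each output channel sees only a subset of the input channels."""

    def __init__(self, in_channels, out_channels, kernel_size, density=0.5):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size) * 0.01
        )
        self.bias = nn.Parameter(torch.zeros(out_channels))
        # Fixed binary mask over (output channel, input channel) pairs.
        # A masked-out pair contributes nothing, so its weights never need
        # to be stored or multiplied at inference time.
        mask = (torch.rand(out_channels, in_channels) < density).float()
        self.register_buffer("mask", mask[:, :, None, None])

    def forward(self, x):
        # Dense conv with masked weights; a production kernel would skip
        # the zeroed connections instead of multiplying by zero.
        return F.conv2d(x, self.weight * self.mask, self.bias,
                        padding=self.weight.shape[-1] // 2)

# Example: 64 -> 128 channels, 3x3 kernels, keeping ~25% of the
# possible input/output channel connections.
layer = ChannelSparseConv2d(64, 128, kernel_size=3, density=0.25)
y = layer(torch.randn(1, 64, 32, 32))   # shape: (1, 128, 32, 32)
```

Because the connection mask is fixed, the deactivated weights can in principle be dropped from storage entirely, which is where the run-time and memory savings described in the abstract come from.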

