Shift-based Primitives for Efficient Convolutional Neural Networks

09/22/2018
by Huasong Zhong, et al.

We propose a collection of three shift-based primitives for building efficient, compact CNN architectures. These three primitives (channel shift, address shift, shortcut shift) reduce inference time on the GPU while maintaining prediction accuracy. The shift-based primitives only move pointers and avoid memory copies, which makes them very fast. For example, the channel shift operation is 12.7x faster than the channel shuffle in ShuffleNet while achieving the same accuracy. Address shift and channel shift can be merged into the point-wise group convolution and invoked with a single kernel call, so the spatial convolution and channel shift together take little time. Shortcut shift requires no time to realize a residual connection, because the output space is allocated in advance. We blend these shift-based primitives with point-wise group convolution and build two inference-efficient CNN architectures named AddressNet and Enhanced AddressNet. Experiments on the CIFAR-100 and ImageNet datasets show that our models are faster and achieve comparable or better accuracy.
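To make the contrast with ShuffleNet concrete, the sketch below compares the two operations at the semantic level. It is a minimal illustration in PyTorch, not the authors' CUDA implementation; the tensor shapes, function names, and the shift amount are illustrative assumptions. The key point from the abstract is that channel shuffle reorders data in memory, whereas channel shift is a cyclic re-indexing that a GPU kernel can realize by simply offsetting the data pointer.

```python
# Minimal sketch (PyTorch assumed; the paper's primitives are CUDA kernels).
# Names and shapes here are illustrative, not the authors' code.
import torch

def channel_shuffle(x, groups):
    # ShuffleNet-style shuffle: reshape -> transpose -> flatten.
    # The transpose forces a real reordering of data in memory.
    n, c, h, w = x.shape
    return (x.view(n, groups, c // groups, h, w)
             .transpose(1, 2)
             .reshape(n, c, h, w))

def channel_shift(x, shift):
    # Cyclic shift along the channel dimension. Semantically this is just
    # re-indexing; in a fused GPU kernel the same effect can be obtained by
    # offsetting the base pointer, so no copy is needed.
    return torch.roll(x, shifts=shift, dims=1)

x = torch.randn(1, 8, 4, 4)
y = channel_shift(x, shift=2)     # channel order: 6,7,0,1,2,3,4,5
z = channel_shuffle(x, groups=2)  # channel order: 0,4,1,5,2,6,3,7
```

In this reading, shortcut shift follows the same logic one step further: if the buffer holding a block's output is allocated next to the buffer holding its input, the residual connection is realized by how the memory is laid out, at no extra runtime cost.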


