All You Need is a Few Shifts: Designing Efficient Convolutional Neural Networks for Image Classification

03/13/2019
by   Weijie Chen, et al.

The shift operation is an efficient alternative to depthwise separable convolution. However, it is still bottlenecked by its implementation, namely memory movement. To push this direction forward, this paper introduces a novel basic component named the Sparse Shift Layer (SSL) for constructing efficient convolutional neural networks. In this family of architectures, the basic block is composed only of 1x1 convolutional layers, with only a few shift operations applied to the intermediate feature maps. To make this idea feasible, we introduce a shift-operation penalty during optimization and further propose a quantization-aware shift learning method to make the learned displacements more friendly for inference. Extensive ablation studies indicate that only a few shift operations are sufficient to provide spatial information communication. Furthermore, to maximize the role of SSL, we redesign an improved network architecture to Fully Exploit the limited capacity of the neural Network (FE-Net). Equipped with SSL, this network achieves 75.0% top-1 accuracy on ImageNet with only 563M multiply-adds (MAdds). It surpasses counterparts built with depthwise separable convolution, as well as networks found by neural architecture search (NAS), in both accuracy and practical speed.
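To make the abstract's core idea concrete, here is a minimal NumPy sketch of the two ingredients of an SSL-style block: a per-channel spatial shift (zero FLOPs, zero parameters; in a Sparse Shift Layer most channels keep a (0, 0) displacement and do not move) followed by a 1x1 convolution that mixes channels. The function names and the dense-loop implementation are illustrative assumptions, not the authors' code; a real implementation would fuse the shifts into the memory access pattern rather than materialize them.

```python
import numpy as np

def shift_feature_map(x, shifts):
    """Shift each channel of x (C, H, W) by its (dy, dx) displacement.

    Vacated positions are zero-filled. This costs no multiplies and
    no parameters; only the few channels with nonzero (dy, dx) move.
    """
    c, h, w = x.shape
    out = np.zeros_like(x)
    for i, (dy, dx) in enumerate(shifts):
        # Destination and source windows for an integer displacement.
        dst_y = slice(max(dy, 0), h + min(dy, 0))
        dst_x = slice(max(dx, 0), w + min(dx, 0))
        src_y = slice(max(-dy, 0), h + min(-dy, 0))
        src_x = slice(max(-dx, 0), w + min(-dx, 0))
        out[i, dst_y, dst_x] = x[i, src_y, src_x]
    return out

def pointwise_conv(x, weight):
    """1x1 convolution: a per-pixel linear mix of channels.

    x: (C_in, H, W), weight: (C_out, C_in) -> (C_out, H, W).
    """
    return np.einsum('oc,chw->ohw', weight, x)

# A sparse-shift block: shift a few channels, then mix with 1x1 conv.
x = np.arange(2 * 3 * 3, dtype=float).reshape(2, 3, 3)
shifts = [(0, 0), (1, 0)]          # only the second channel moves (down by 1)
y = pointwise_conv(shift_feature_map(x, shifts), np.ones((1, 2)))
```

Because the shifted channels are then recombined by 1x1 convolutions, spatial context flows between pixels without any kxk spatial kernels; the quantization-aware learning in the paper is what keeps the learned displacements at integer values so this zero-FLOP indexing remains valid at inference time.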

Related research

- 09/22/2018, Shift-based Primitives for Efficient Convolutional Neural Networks: "We propose a collection of three shift-based primitives for building eff..."
- 12/20/2021, Encoding Hierarchical Information in Neural Networks helps in Subpopulation Shift: "Over the past decade, deep neural networks have proven to be adept in im..."
- 09/20/2021, GhostShiftAddNet: More Features from Energy-Efficient Operations: "Deep convolutional neural networks (CNNs) are computationally and memory..."
- 05/29/2019, Attention Based Pruning for Shift Networks: "In many application domains such as computer vision, Convolutional Layer..."
- 11/22/2017, Shift: A Zero FLOP, Zero Parameter Alternative to Spatial Convolutions: "Neural networks rely on convolutions to aggregate spatial information. H..."
- 10/17/2022, Defects of Convolutional Decoder Networks in Frequency Representation: "In this paper, we prove representation bottlenecks of a cascaded convolu..."
- 10/22/2019, 4-Connected Shift Residual Networks: "The shift operation was recently introduced as an alternative to spatial..."
