SEP-Nets: Small and Effective Pattern Networks

06/13/2017
by Zhe Li et al.

While going deeper has been shown to improve the performance of convolutional neural networks (CNNs), going smaller has received increasing attention recently due to its attractiveness for mobile/embedded applications. How to design a small network while retaining the performance of large and deep CNNs (e.g., Inception Nets, ResNets) remains an active and important topic. Although there have been intensive studies on compressing the size of CNNs, a considerable drop in performance is still a key concern in many designs. This paper addresses this concern with several new contributions. First, we propose a simple yet powerful method for compressing the size of deep CNNs based on parameter binarization. The striking difference from most previous work on parameter binarization/quantization lies in the different treatment of 1×1 convolutions and k×k convolutions (k>1): we binarize only the k×k convolutions into binary patterns. The resulting networks are referred to as pattern networks. By doing this, we show that previous deep CNNs such as GoogLeNet and Inception-type Nets can be compressed dramatically with a marginal drop in performance. Second, in light of the different functionalities of 1×1 convolutions (data projection/transformation) and k×k convolutions (pattern extraction), we propose a new block structure codenamed the pattern residual block, which adds transformed feature maps generated by 1×1 convolutions to the pattern feature maps generated by k×k convolutions; based on this block we design a small network with ∼1 million parameters. Combined with our parameter binarization, we achieve better performance on ImageNet than similarly sized networks, including the recently released Google MobileNets.
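The two ideas in the abstract can be illustrated with a minimal single-channel numpy sketch. This is a hypothetical toy version, not the paper's implementation: the scaling of the binarized kernel by the mean absolute weight is a common choice borrowed from binary-network literature and is an assumption here, as is the exact composition of the block (binarized k×k pattern map plus a 1×1-projected map, which for a single channel reduces to a scalar multiple of the input).

```python
import numpy as np

def binarize_kernel(w):
    """Binarize a k x k kernel to {-alpha, +alpha}, where alpha is the
    mean absolute weight (a common scaling choice; the paper's exact
    scheme may differ)."""
    alpha = np.abs(w).mean()
    return alpha * np.sign(w)

def conv2d(x, w):
    """Naive same-padding 2D convolution (DL convention, i.e. cross-
    correlation) for a single-channel feature map."""
    k = w.shape[0]
    p = k // 2
    xp = np.pad(x, p)
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = (xp[i:i + k, j:j + k] * w).sum()
    return out

def pattern_residual_block(x, w_kxk, w_1x1):
    """Toy single-channel pattern residual block: add the 1x1-transformed
    map to the binarized k x k pattern map."""
    pattern = conv2d(x, binarize_kernel(w_kxk))  # pattern extraction
    projection = w_1x1 * x  # a 1x1 conv on one channel is a scalar scaling
    return pattern + projection
```

In the real multi-channel setting the 1×1 path is a full channel-mixing projection and only the k×k kernels carry binary patterns, which is what lets the k×k weights be stored with one bit each plus a shared scale.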


Related research

- ChannelNets: Compact and Efficient Convolutional Neural Networks via Channel-Wise Convolutions (09/05/2018)
- NeuroTreeNet: A New Method to Explore Horizontal Expansion Network (11/22/2018)
- AsymmNet: Towards ultralight convolution neural networks using asymmetrical bottlenecks (04/15/2021)
- Accuracy Booster: Performance Boosting using Feature Map Re-calibration (03/11/2019)
- Learning Spatio-Temporal Representation with Pseudo-3D Residual Networks (11/28/2017)
- Deep discriminant analysis for task-dependent compact network search (09/29/2020)
- Feature Products Yield Efficient Networks (08/18/2020)
