Depthwise Multiception Convolution for Reducing Network Parameters without Sacrificing Accuracy

11/07/2020
by   Guoqing Bao, et al.

Deep convolutional neural networks have proven successful in multiple benchmark challenges in recent years. However, the performance improvements rely heavily on increasingly complex network architectures and large numbers of parameters, which demand ever-growing amounts of storage and memory capacity. Depthwise separable convolution (DSConv) effectively reduces the number of required parameters by decoupling a standard convolution into spatial (depthwise) and cross-channel (pointwise) steps; however, this decoupling degrades accuracy. To address this problem, we present depthwise multiception convolution, termed Multiception, which introduces layer-wise multiscale kernels to learn multiscale representations of all individual input channels simultaneously. In experiments on four benchmark datasets (Cifar-10, Cifar-100, STL-10, and ImageNet32x32) with five popular CNN models, Multiception improved accuracy in all models and achieved higher accuracy than related works. Meanwhile, Multiception significantly reduces the number of parameters of standard convolution-based models by 32.48%
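The parameter savings described above follow directly from how each convolution variant factorizes its weights. The sketch below compares weight counts for a single layer: a standard convolution, DSConv, and a Multiception-style multiscale depthwise step followed by a pointwise convolution. The kernel-size set {1, 3, 5, 7} and the channel sizes are illustrative assumptions, not necessarily the paper's exact configuration.

```python
def standard_conv_params(c_in, c_out, k):
    """Weights of a standard k x k convolution (bias omitted):
    every output channel has a k x k filter over all input channels."""
    return c_in * c_out * k * k

def dsconv_params(c_in, c_out, k):
    """Depthwise separable convolution: one k x k filter per input
    channel, followed by a 1 x 1 pointwise conv that mixes channels."""
    depthwise = c_in * k * k
    pointwise = c_in * c_out
    return depthwise + pointwise

def multiception_params(c_in, c_out, kernel_sizes=(1, 3, 5, 7)):
    """Multiception-style step (assumed layout): each input channel
    gets one depthwise filter per kernel size, so multiscale features
    are extracted in parallel, then a pointwise conv mixes channels."""
    depthwise = c_in * sum(k * k for k in kernel_sizes)
    pointwise = c_in * c_out
    return depthwise + pointwise

if __name__ == "__main__":
    c_in, c_out, k = 64, 128, 3
    print("standard:    ", standard_conv_params(c_in, c_out, k))   # 73728
    print("DSConv:      ", dsconv_params(c_in, c_out, k))          # 8768
    print("Multiception:", multiception_params(c_in, c_out))       # 13568
```

For these illustrative sizes, the multiscale depthwise step costs more than plain DSConv but remains far cheaper than the standard convolution, which is the trade-off the abstract describes: extra multiscale capacity to recover accuracy while still cutting parameters substantially.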

Related research

06/11/2020 · Multigrid-in-Channels Architectures for Wide Convolutional Neural Networks
We present a multigrid approach that combats the quadratic growth of the...

08/05/2018 · 3D Depthwise Convolution: Reducing Model Parameters in 3D Vision Tasks
Standard 3D convolution operations require much larger amounts of memory...

10/21/2019 · CPWC: Contextual Point Wise Convolution for Object Recognition
Convolutional layers are a major driving force behind the successes of d...

11/17/2018 · PydMobileNet: Improved Version of MobileNets with Pyramid Depthwise Separable Convolution
Convolutional neural networks (CNNs) have shown remarkable performance i...

09/25/2019 · FALCON: Fast and Lightweight Convolution for Compressing and Accelerating CNN
How can we efficiently compress Convolutional Neural Networks (CNN) whil...

01/18/2019 · Machine Learning with Clos Networks
We present a new methodology for improving the accuracy of small neural ...

06/26/2023 · Optimized Vectorizing of Building Structures with Swap: High-Efficiency Convolutional Channel-Swap Hybridization Strategy
The building planar graph reconstruction, a.k.a. footprint reconstructio...
