Rethinking Depthwise Separable Convolutions: How Intra-Kernel Correlations Lead to Improved MobileNets

03/30/2020
by   Daniel Haase, et al.

We introduce blueprint separable convolutions (BSConv) as highly efficient building blocks for CNNs. They are motivated by quantitative analyses of kernel properties from trained models, which show the dominance of correlations along the depth axis. Based on our findings, we formulate a theoretical foundation from which we derive efficient implementations using only standard layers. Moreover, our approach provides a thorough theoretical derivation, interpretation, and justification for the application of depthwise separable convolutions (DSCs) in general, which have become the basis of many modern network architectures. Ultimately, we reveal that DSC-based architectures such as MobileNets implicitly rely on cross-kernel correlations, while our BSConv formulation is based on intra-kernel correlations and thus allows for a more efficient separation of regular convolutions. Extensive experiments on large-scale and fine-grained classification datasets show that BSConv clearly and consistently improves MobileNets and other DSC-based architectures without introducing any further complexity. For fine-grained datasets, we achieve an improvement of up to 13.7 percentage points. In addition, if used as a drop-in replacement for standard architectures such as ResNets, BSConv variants also outperform their vanilla counterparts by up to 9.5 percentage points on ImageNet.
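To make the "implementations using only standard layers" concrete, below is a minimal PyTorch sketch of the unconstrained blueprint separation, in which a regular K x K convolution is replaced by a 1x1 pointwise convolution (distributing a spatial blueprint along the depth axis) followed by a K x K depthwise convolution (the blueprint itself). This is the reverse ordering of a standard depthwise separable convolution block. The class name BSConvU and its constructor parameters are illustrative and not taken from the authors' reference code.

```python
import torch
import torch.nn as nn


class BSConvU(nn.Module):
    """Sketch of an unconstrained blueprint separable convolution:
    a 1x1 pointwise convolution followed by a KxK depthwise
    convolution, built entirely from standard nn.Conv2d layers."""

    def __init__(self, in_channels, out_channels, kernel_size=3,
                 stride=1, padding=1):
        super().__init__()
        # 1x1 pointwise convolution: mixes input channels, i.e. weights
        # the shared spatial blueprint along the depth axis.
        self.pointwise = nn.Conv2d(in_channels, out_channels,
                                   kernel_size=1, bias=False)
        # KxK depthwise convolution: one spatial filter per output
        # channel (groups == out_channels), acting as the blueprint.
        self.depthwise = nn.Conv2d(out_channels, out_channels,
                                   kernel_size=kernel_size,
                                   stride=stride, padding=padding,
                                   groups=out_channels, bias=False)

    def forward(self, x):
        return self.depthwise(self.pointwise(x))


# Usage: drop-in replacement for a regular 3x3 convolution layer.
x = torch.randn(1, 32, 56, 56)
block = BSConvU(32, 64)
print(block(x).shape)  # torch.Size([1, 64, 56, 56])
```

The pointwise-then-depthwise ordering is what distinguishes this intra-kernel (blueprint) view from a standard DSC block, which applies the depthwise convolution first and relies on cross-kernel correlations instead.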


