Structured Convolutions for Efficient Neural Network Design

08/06/2020
by Yash Bhalgat, et al.

In this work, we tackle model efficiency by exploiting redundancy in the implicit structure of the building blocks of convolutional neural networks. We begin our analysis by introducing a general definition of Composite Kernel structures, which enable the execution of convolution operations in the form of efficient, scaled, sum-pooling components. As a special case, we propose Structured Convolutions and show that they allow the convolution operation to be decomposed into a sum-pooling operation followed by a convolution with significantly lower complexity and fewer weights. We show how this decomposition applies to 2D and 3D kernels as well as to fully-connected layers. Furthermore, we present a Structural Regularization loss that encourages neural network layers to exploit this desired structure so that, after training, they can be decomposed with negligible performance loss. By applying our method to a wide range of CNN architectures, we demonstrate "structured" versions of the ResNets that are up to 2× smaller and a new Structured-MobileNetV2 that is more efficient while staying within a 1% accuracy loss. We also show similar structured versions of EfficientNet on ImageNet and of the HRNet architecture for semantic segmentation on the Cityscapes dataset. In terms of complexity reduction, our method performs on par with or better than existing tensor decomposition and channel pruning methods.
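The core decomposition described in the abstract can be sketched in a few lines of NumPy. This is a toy illustration, not the authors' implementation: it assumes the simplest case, a 3×3 kernel built as a sum of overlapping 2×2 all-ones blocks, each scaled by one entry of a 2×2 weight tensor `alpha`. Convolving with such a kernel is equivalent to 2×2 sum-pooling (stride 1) followed by a 2×2 convolution with `alpha`, which uses 4 weights instead of 9.

```python
import numpy as np

def corr2d(x, k):
    """Valid-mode 2D cross-correlation (the 'convolution' used in CNNs)."""
    kh, kw = k.shape
    h, w = x.shape[0] - kh + 1, x.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)

# A 3x3 "structured" kernel W: a sum of 2x2 all-ones blocks,
# each scaled by one entry of alpha (4 underlying parameters).
alpha = rng.standard_normal((2, 2))
W = np.zeros((3, 3))
for i in range(2):
    for j in range(2):
        W[i:i + 2, j:j + 2] += alpha[i, j]

x = rng.standard_normal((8, 8))

# Direct convolution with the full 3x3 kernel.
direct = corr2d(x, W)

# Decomposed form: sum-pooling (convolution with an all-ones window,
# no multiplications needed in practice), then a small 2x2 convolution.
pooled = corr2d(x, np.ones((2, 2)))
decomposed = corr2d(pooled, alpha)

assert np.allclose(direct, decomposed)
```

By linearity, the same argument extends to larger kernels, 3D kernels, and fully-connected layers viewed as 1×1 convolutions, which is what makes the decomposition broadly applicable.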


Related research:

04/17/2018
IGCV2: Interleaved Structured Sparse Convolutional Neural Networks
In this paper, we study the problem of designing efficient convolutional...

06/12/2020
Multi Layer Neural Networks as Replacement for Pooling Operations
Pooling operations are a layer found in almost every modern neural netwo...

09/04/2020
ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution
Convolutional Neural Networks (CNNs) are known to be significantly over-...

03/04/2020
Neural Kernels Without Tangents
We investigate the connections between neural networks and simple buildi...

02/11/2021
Uncertainty Propagation in Convolutional Neural Networks: Technical Report
In this technical report we study the problem of propagation of uncertai...

08/16/2018
Network Decoupling: From Regular to Depthwise Separable Convolutions
Depthwise separable convolution has shown great efficiency in network de...

04/10/2019
Pixel-Adaptive Convolutional Neural Networks
Convolutions are the fundamental building block of CNNs. The fact that t...
