ACDC: Weight Sharing in Atom-Coefficient Decomposed Convolution

09/04/2020
by   Ze Wang, et al.

Convolutional Neural Networks (CNNs) are known to be significantly over-parametrized, and difficult to interpret, train, and adapt. In this paper, we introduce a structural regularization across convolutional kernels in a CNN. In our approach, each convolution kernel is first decomposed into 2D dictionary atoms linearly combined by coefficients. The widely observed correlation and redundancy in a CNN hint at a common low-rank structure among the decomposed coefficients, which is further supported by our empirical observations. We then explicitly regularize CNN kernels by enforcing the decomposed coefficients to be shared across sub-structures, while leaving each sub-structure only its own dictionary atoms, typically a few hundred parameters, which leads to dramatic model reductions. We explore models with sharing across different sub-structures to cover a wide range of trade-offs between parameter reduction and expressiveness. Our proposed regularized network structures open the door to better interpreting, training, and adapting deep models. We validate the flexibility and compatibility of our method with image classification experiments on multiple datasets and underlying network structures, and show that CNNs maintain performance with dramatic reductions in parameters and computations, e.g., only 5% of the parameters of a ResNet-18 are used to achieve comparable performance. Further experiments on few-shot classification show faster and more robust task adaptation in comparison with models using standard convolutions.
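The core decomposition described above can be sketched in a few lines: each convolution kernel is reconstructed as a linear combination of a small bank of 2D dictionary atoms, with the combination coefficients shared across sub-structures. The sketch below is a minimal illustration under assumed shapes (the atom count, channel sizes, and variable names are ours, not the paper's); it only shows the kernel reconstruction, not the full training or sharing scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

k = 3            # spatial kernel size (assumed)
num_atoms = 6    # dictionary size per sub-structure (assumed, illustrative)
c_out, c_in = 8, 4

# Per-sub-structure dictionary atoms: num_atoms x k x k.
# These are the few parameters each sub-structure keeps to itself.
atoms = rng.standard_normal((num_atoms, k, k))

# Decomposed coefficients: c_out x c_in x num_atoms.
# In ACDC these would be shared across sub-structures.
coeffs = rng.standard_normal((c_out, c_in, num_atoms))

# Reconstruct the full convolution kernel (c_out x c_in x k x k)
# as a linear combination of atoms weighted by coefficients.
kernel = np.einsum('oin,nkl->oikl', coeffs, atoms)

assert kernel.shape == (c_out, c_in, k, k)
```

With sharing, the per-sub-structure cost drops to the atoms alone (here 6 x 3 x 3 = 54 parameters), while the coefficient tensor is paid for once, which is where the dramatic parameter reductions come from.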

Related research

PENNI: Pruned Kernel Sharing for Efficient CNN Inference (05/14/2020)
Although state-of-the-art (SOTA) CNNs achieve outstanding performance on...

Structured Convolutions for Efficient Neural Network Design (08/06/2020)
In this work, we tackle model efficiency by exploiting redundancy in the...

IGCV2: Interleaved Structured Sparse Convolutional Neural Networks (04/17/2018)
In this paper, we study the problem of designing efficient convolutional...

Multigrid-in-Channels Architectures for Wide Convolutional Neural Networks (06/11/2020)
We present a multigrid approach that combats the quadratic growth of the...

DCFNet: Deep Neural Network with Decomposed Convolutional Filters (02/12/2018)
Filters in a Convolutional Neural Network (CNN) contain model parameters...

Efficient Fusion of Sparse and Complementary Convolutions for Object Recognition and Detection (08/07/2018)
We propose a new method for exploiting sparsity in convolutional kernels...

LCNN: Lookup-based Convolutional Neural Network (11/20/2016)
Porting state of the art deep learning algorithms to resource constraine...
