ℓ_0 Regularized Structured Sparsity Convolutional Neural Networks

12/17/2019
by Kevin Bui, et al.

Deepening and widening convolutional neural networks (CNNs) significantly increases the number of trainable weight parameters by adding more convolutional layers and feature maps per layer, respectively. By imposing inter- and intra-group sparsity onto the weights of the layers during the training process, a compressed network can be obtained with accuracy comparable to a dense one. In this paper, we propose a new variant of sparse group lasso that blends the ℓ_0 norm on the individual weight parameters with the ℓ_2,1 norm on the output channels of a layer. To address the non-differentiability of the ℓ_0 norm, we apply variable splitting, resulting in an algorithm that executes stochastic gradient descent followed by hard thresholding at each iteration. Numerical experiments are conducted on LeNet-5 for MNIST and on wide residual networks for CIFAR-10/100. They showcase the effectiveness of our proposed method in attaining superior test accuracy with network sparsification on par with the current state of the art.
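The iteration described above lends itself to a short sketch. The following is a minimal, hypothetical PyTorch illustration, not the authors' code: one training step applies stochastic gradient descent to the data loss plus an ℓ_2,1 penalty on the output channels, then hard-thresholds the individual weights, which is the proximal operator of the ℓ_0 penalty. The hyperparameters lam (ℓ_0 weight) and mu (ℓ_2,1 weight), and the threshold sqrt(2*lam*lr), are assumptions for illustration; the paper's variable-splitting scheme may use a different effective threshold.

    import torch
    import torch.nn as nn

    def l21_output_channels(conv: nn.Conv2d, eps: float = 1e-12) -> torch.Tensor:
        # l_{2,1} norm over output channels: sum of per-filter l_2 norms.
        # conv.weight has shape (out_channels, in_channels, kH, kW);
        # eps keeps the gradient finite once a whole filter has been zeroed.
        w = conv.weight.flatten(1)
        return (w.pow(2).sum(dim=1) + eps).sqrt().sum()

    @torch.no_grad()
    def hard_threshold(model: nn.Module, thresh: float) -> None:
        # Zero out individual weights with magnitude at or below thresh;
        # this is the proximal operator of the (scaled) l_0 penalty.
        for m in model.modules():
            if isinstance(m, nn.Conv2d):
                m.weight.mul_((m.weight.abs() > thresh).float())

    def train_step(model, loss_fn, x, y, optimizer, lr=0.1, lam=1e-4, mu=1e-4):
        optimizer.zero_grad()
        penalty = sum(l21_output_channels(m) for m in model.modules()
                      if isinstance(m, nn.Conv2d))
        loss = loss_fn(model(x), y) + mu * penalty
        loss.backward()
        optimizer.step()                                 # SGD step
        hard_threshold(model, (2.0 * lam * lr) ** 0.5)   # l_0 proximal step
        return loss.item()

In a full training loop, train_step would be called once per minibatch. The hard-thresholding step keeps the iterate exactly sparse at the level of individual weights, while the ℓ_2,1 term pushes entire output channels toward zero so that whole feature maps can be pruned after training.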


Related research

11/29/2020 · Layer Pruning via Fusible Residual Convolutional Block for Deep Neural Networks
In order to deploy deep convolutional neural networks (CNNs) on resource...

01/28/2019 · Convolutional Neural Networks with Layer Reuse
A convolutional layer in a Convolutional Neural Network (CNN) consists o...

08/09/2019 · Group Pruning using a Bounded-Lp norm for Group Gating and Regularization
Deep neural networks achieve state-of-the-art results on several tasks w...

12/10/2018 · Reliable Identification of Redundant Kernels for Convolutional Neural Network Compression
To compress deep convolutional neural networks (CNNs) with large memory ...

08/08/2020 · Representation Learning via Cauchy Convolutional Sparse Coding
In representation learning, Convolutional Sparse Coding (CSC) enables un...

09/12/2019 · A Channel-Pruned and Weight-Binarized Convolutional Neural Network for Keyword Spotting
We study channel number reduction in combination with weight binarizatio...

11/19/2019 · Inter-layer Collision Networks
Deeper neural networks are hard to train. Inspired by the elastic collis...
