Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks

04/09/2020
by   Kakeru Mitsuno, et al.

In a deep neural network (DNN), the number of parameters is usually huge in order to achieve high performance. As a result, DNNs require a lot of memory and substantial computational resources, and are also prone to overfitting. It is known that some parameters are redundant and can be removed from the network without decreasing performance. Many sparse regularization criteria have been proposed to solve this problem. In a convolutional neural network (CNN), group sparse regularizations are often used to remove unnecessary subsets of the weights, such as filters or channels. However, when a group sparse regularization treats the weights connected to a neuron as a group, the individual convolution filters are not treated as target groups in the regularization. In this paper, we introduce the concept of hierarchical grouping to solve this problem, and we propose several hierarchical group sparse regularization criteria for CNNs. The proposed hierarchical group sparse regularization can treat the weights connected to an input neuron or an output neuron as a group and, within that group, each convolutional filter as a subgroup, so that unnecessary subsets of the weights can be pruned. As a result, we can prune the weights more adequately, depending on the structure of the network and the number of channels, while keeping high performance. In the experiments, we investigate the effectiveness of the proposed sparse regularizations through intensive comparison experiments on public datasets with several network architectures. Code is available on GitHub: "https://github.com/K-Mitsuno/hierarchical-group-sparse-regularization"
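To make the idea of hierarchical grouping concrete, the following is a minimal NumPy sketch of one possible hierarchical group sparse penalty on a convolutional weight tensor. It combines a neuron-level group term (all weights feeding one output neuron) with a filter-level group term (each individual kernel slice). This is an illustrative form only, not necessarily the exact criterion proposed in the paper; the function name and the weighting parameters `lam_neuron` and `lam_filter` are assumptions for illustration.

```python
import numpy as np

def hierarchical_group_sparse_penalty(w, lam_neuron=1.0, lam_filter=1.0):
    """Illustrative hierarchical group sparse penalty (not the paper's exact criterion).

    w: conv weight tensor of shape (out_channels, in_channels, kh, kw).
    Inner groups: each (kh, kw) convolution filter w[n, c].
    Outer groups: all weights connected to one output neuron, w[n].
    """
    # L2 norm of each individual filter slice -> shape (out_channels, in_channels)
    filter_norms = np.sqrt((w ** 2).sum(axis=(2, 3)))
    # L2 norm of each output neuron's full incoming weight block -> shape (out_channels,)
    neuron_norms = np.sqrt((w ** 2).sum(axis=(1, 2, 3)))
    # Summing the group norms (a group-lasso-style penalty at both levels)
    # encourages whole filters and whole neurons to be driven to zero together.
    return lam_filter * filter_norms.sum() + lam_neuron * neuron_norms.sum()

# Example: a 2-neuron layer with 3 input channels and 1x1 kernels of all ones.
w = np.ones((2, 3, 1, 1))
penalty = hierarchical_group_sparse_penalty(w)  # 6 + 2*sqrt(3)
```

In training, such a penalty would be added to the task loss, after which groups whose norms shrink below a threshold can be pruned from the network.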
