An Improving Framework of Regularization for Network Compression

12/11/2019
by E Zhenqian, et al.

Deep neural networks have achieved remarkable success in image recognition, object detection, and many other applications, relying on the growing computational power of GPUs, large-scale datasets, and increasing network depth and width. However, because of their expensive computation and intensive memory requirements, researchers have concentrated on designing compression methods in recent years. In this paper, we first briefly summarize the existing advanced techniques that are useful for model compression. After that, we give a detailed description of group lasso regularization and its variants. More importantly, we propose an improved framework of partial regularization based on the relationship between neurons and connections of adjacent layers, which is reasonable and feasible thanks to the permutation property of neural networks. Experimental results show that partial regularization methods bring improvements, such as higher classification accuracy in both the training and testing stages, on multiple datasets. Since our regularizers involve the computation of fewer parameters, they also show competitive performance in terms of the total running time of the experiments. Finally, we analyze the results and conclude that the optimal network structure must exist and depends on the input data.
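As a rough illustration of the kind of penalty the abstract refers to, below is a minimal PyTorch sketch of group lasso regularization over a fully connected layer, together with a hypothetical "partial" variant that penalizes only a subset of the groups. The column-wise grouping (one group per input neuron), the function names, and the lam and keep_ratio parameters are assumptions made for illustration, not the paper's exact formulation.

```python
# Minimal sketch of group lasso regularization for a fully connected
# layer, assuming PyTorch. Grouping, names, and hyperparameters are
# illustrative assumptions, not the paper's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F


def group_lasso_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Sum of L2 norms over column groups: sum_g ||W[:, g]||_2.

    Each column corresponds to one input neuron, so driving a whole
    column to zero removes that neuron. This structured effect is what
    distinguishes group lasso from element-wise L1 regularization.
    """
    return weight.norm(p=2, dim=0).sum()


def partial_group_lasso_penalty(weight: torch.Tensor,
                                keep_ratio: float = 0.5) -> torch.Tensor:
    """Hypothetical "partial" variant: penalize only the trailing
    (1 - keep_ratio) fraction of the column groups, leaving the rest
    unregularized so the penalty involves fewer parameters."""
    start = int(weight.shape[1] * keep_ratio)
    return weight[:, start:].norm(p=2, dim=0).sum()


# Usage: add the penalty to the task loss during training.
layer = nn.Linear(256, 128)          # weight has shape (128, 256)
x, target = torch.randn(32, 256), torch.randn(32, 128)

lam = 1e-3                           # illustrative regularization strength
loss = F.mse_loss(layer(x), target)
loss = loss + lam * partial_group_lasso_penalty(layer.weight)
loss.backward()
```

The design intuition is that a structured penalty can zero out entire neurons (and hence their connections to the adjacent layer) rather than scattered individual weights, while a partial penalty touches fewer parameters per step, which is consistent with the running-time claim in the abstract.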


Related research

09/03/2020 - A Partial Regularization Method for Network Compression
Deep Neural Networks have achieved remarkable success relying on the dev...

01/30/2022 - Training Thinner and Deeper Neural Networks: Jumpstart Regularization
Neural networks are more expressive when they have multiple layers. In t...

04/15/2018 - SparseNet: A Sparse DenseNet for Image Classification
Deep neural networks have made remarkable progresses on various computer...

11/11/2019 - A Computing Kernel for Network Binarization on PyTorch
Deep Neural Networks have now achieved state-of-the-art results in a wid...

04/04/2022 - Evolving Neural Selection with Adaptive Regularization
Over-parameterization is one of the inherent characteristics of modern d...

03/28/2022 - MixNN: A design for protecting deep learning models
In this paper, we propose a novel design, called MixNN, for protecting d...
