Weight Evolution: Improving Deep Neural Networks Training through Evolving Inferior Weight Values

10/09/2021
by Zhenquan Lin, et al.

To obtain good performance, convolutional neural networks are usually over-parameterized. This phenomenon has stimulated two interesting lines of work: pruning unimportant weights for compression, and reactivating unimportant weights to make full use of the network's capacity. However, current weight reactivation methods usually reactivate entire filters, which may not be precise enough. The popularity of filter pruning is largely due to its friendliness to hardware implementation, but pruning at a finer granularity, i.e., individual weight elements, usually yields better network performance. In this paper we study the problem of weight element reactivation. Motivated by evolution, we select unimportant filters and update their unimportant elements by combining them with the important elements of important filters, much as gene crossover produces better offspring; we call the proposed method weight evolution (WE). WE consists of four strategies: a global selection strategy and a local selection strategy, which are combined to locate unimportant filters; a forward matching strategy to find the matched important filters; and a crossover strategy that uses the important elements of the important filters to update the unimportant filters. WE can be plugged into existing network architectures. Comprehensive experiments show that WE outperforms other reactivation and plug-in training methods on typical convolutional neural networks, especially lightweight networks. Our code is available at https://github.com/BZQLin/Weight-evolution.
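Since the abstract only describes the four strategies at a high level, the snippet below is a minimal, hypothetical PyTorch sketch of one crossover-style update step. It assumes an L1-norm importance criterion, cosine-similarity matching, and the function name and fractions shown here; these stand in for the paper's actual global/local selection and forward matching rules, so refer to the linked repository for the authors' implementation.

```python
import torch

def weight_evolution_step(weight, filter_frac=0.25, elem_frac=0.5):
    """Hypothetical sketch of a WE-style crossover update on one conv layer.

    `weight` has shape (out_channels, in_channels, kH, kW). Filters with the
    smallest L1 norm are treated as "unimportant" (an assumed criterion; the
    paper combines global and local selection). Each unimportant filter is
    matched to the most similar important filter by cosine similarity (an
    assumption standing in for forward matching), and its smallest-magnitude
    elements are overwritten by the matched filter's elements (crossover).
    """
    with torch.no_grad():
        flat = weight.reshape(weight.shape[0], -1).clone()  # (out, in*kH*kW)
        norms = flat.abs().sum(dim=1)                        # L1 norm per filter
        n_filters = weight.shape[0]
        n_unimp = max(1, int(filter_frac * n_filters))

        order = norms.argsort()
        unimp_idx = order[:n_unimp]          # least important filters
        imp_idx = order[n_unimp:]            # remaining (important) filters
        if imp_idx.numel() == 0:
            return

        imp_flat = flat[imp_idx]
        n_elems = flat.shape[1]
        k = max(1, int(elem_frac * n_elems))

        for i in unimp_idx.tolist():
            # Matching (assumed): pick the most similar important filter.
            sims = torch.nn.functional.cosine_similarity(
                flat[i].unsqueeze(0), imp_flat, dim=1)
            match = imp_idx[sims.argmax()]

            # Crossover: replace the smallest-magnitude elements of the
            # unimportant filter with the matched filter's elements.
            small = flat[i].abs().argsort()[:k]
            flat[i][small] = flat[match][small]

        weight.copy_(flat.reshape_as(weight))

# Example usage (hypothetical): apply periodically during training.
model = torch.nn.Sequential(torch.nn.Conv2d(3, 16, 3), torch.nn.ReLU(),
                            torch.nn.Conv2d(16, 32, 3))
for module in model.modules():
    if isinstance(module, torch.nn.Conv2d):
        weight_evolution_step(module.weight)
```

In the sketch, only the selected low-magnitude elements are overwritten, so the important elements of the unimportant filter are preserved, which mirrors the element-level (rather than whole-filter) reactivation the abstract argues for.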

Related research

- Softer Pruning, Incremental Regularization (10/19/2020): Network pruning is widely used to compress Deep Neural Networks (DNNs). ...
- Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks (08/21/2018): This paper proposed a Soft Filter Pruning (SFP) method to accelerate the...
- Non-Parametric Adaptive Network Pruning (01/20/2021): Popular network pruning algorithms reduce redundant information by optim...
- Convolution-Weight-Distribution Assumption: Rethinking the Criteria of Channel Pruning (04/24/2020): Channel pruning is one of the most important techniques for compressing ...
- Towards Learning of Filter-Level Heterogeneous Compression of Convolutional Neural Networks (04/22/2019): Recently, deep learning has become a de facto standard in machine learni...
- Dynamic Weight Alignment for Convolutional Neural Networks (12/18/2017): In this paper, we propose a method of improving Convolutional Neural Net...
- Dynamic Slimmable Network (03/24/2021): Current dynamic networks and dynamic pruning methods have shown their pr...
