Compact Neural Networks via Stacking Designed Basic Units

05/03/2022
by Weichao Lan, et al.

Unstructured pruning has the limitation of producing sparse and irregular weights. By contrast, structured pruning eliminates this drawback, but it requires complex criteria to determine which components to prune. To this end, this paper presents a new method termed TissueNet, which directly constructs compact neural networks with fewer weight parameters by independently stacking designed basic units, without requiring any additional judgment criteria. Given basic units of various architectures, they are combined and stacked in a certain form to build up compact neural networks. We formulate TissueNet with diverse popular backbones and compare it against state-of-the-art pruning methods on different benchmark datasets. Moreover, two new metrics are proposed to evaluate compression performance. Experiment results show that TissueNet can achieve comparable classification accuracy while saving up to around 80% of parameters. Hence, stacking basic units provides a new promising way for network compression.
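
To make the stacking idea concrete, here is a minimal PyTorch sketch. It is a hypothetical illustration, not the paper's implementation: the names BasicUnit and build_stacked_net, and the conv-BN-ReLU unit design, are assumptions made here for clarity; the paper's designed units may differ.

import torch
import torch.nn as nn

class BasicUnit(nn.Module):
    """A small, parameter-light designed unit (assumed conv-BN-ReLU block)."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.block = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch),
            nn.ReLU(inplace=True),
        )

    def forward(self, x):
        return self.block(x)

def build_stacked_net(channels, num_classes=10):
    """Stack independently constructed basic units into one compact network."""
    units = [BasicUnit(cin, cout) for cin, cout in zip(channels, channels[1:])]
    return nn.Sequential(
        *units,
        nn.AdaptiveAvgPool2d(1),  # global pooling before the classifier
        nn.Flatten(),
        nn.Linear(channels[-1], num_classes),
    )

if __name__ == "__main__":
    net = build_stacked_net([3, 16, 32, 64])
    x = torch.randn(1, 3, 32, 32)
    print(net(x).shape)                               # torch.Size([1, 10])
    print(sum(p.numel() for p in net.parameters()))   # total parameter count

Under this reading, the parameter budget is fixed by the choice and number of stacked units at construction time, so no post-hoc pruning criterion is needed, which matches the abstract's claim.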

Related research

02/10/2021 · Pruning of Convolutional Neural Networks Using Ising Energy Model
Pruning is one of the major methods to compress deep neural networks. In...

05/26/2021 · Dynamic Probabilistic Pruning: A general framework for hardware-constrained pruning at different granularities
Unstructured neural network pruning algorithms have achieved impressive ...

11/19/2019 · DARB: A Density-Aware Regular-Block Pruning for Deep Neural Networks
The rapidly growing parameter volume of deep neural networks (DNNs) hind...

11/10/2020 · Dirichlet Pruning for Neural Network Compression
We introduce Dirichlet pruning, a novel post-processing technique to tra...

07/09/2021 · Structured Model Pruning of Convolutional Networks on Tensor Processing Units
The deployment of convolutional neural networks is often hindered by hig...

06/15/2018 · Detecting Dead Weights and Units in Neural Networks
Deep Neural Networks are highly over-parameterized and the size of the n...

11/28/2017 · WSNet: Compact and Efficient Networks with Weight Sampling
We present a new approach and a novel architecture, termed WSNet, for le...
