EAPruning: Evolutionary Pruning for Vision Transformers and CNNs

10/01/2022
by Qingyuan Li, et al.

Structured pruning greatly eases the deployment of large neural networks in resource-constrained environments. However, current methods either rely on strong domain expertise, require extra hyperparameter tuning, or are restricted to a specific type of network, which hinders widespread industrial application. In this paper, we take a simple and effective approach that can be easily applied to both vision transformers and convolutional neural networks. Specifically, we consider pruning as an evolution process over sub-network structures that inherit weights through reconstruction techniques. We achieve a 50% reduction with a 1.34x speedup. For DeiT-Base, we reach nearly a 40% reduction and a 1.4x speedup. Our code will be made available.
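To make the idea of pruning as an evolutionary search concrete, below is a minimal Python sketch of how such a process could be structured. It is not the authors' algorithm: the candidate encoding (per-layer keep ratios), the population parameters, and especially the fitness function are hypothetical placeholders; in the actual method, fitness would be measured by evaluating each pruned sub-network after it inherits and reconstructs weights from the parent model.

```python
import random

# Illustrative evolutionary pruning search (assumed setup, not the paper's exact method).
# Each candidate is a vector of per-layer "keep ratios" describing a sub-network.

NUM_LAYERS = 12          # assumed depth of the network being pruned
POP_SIZE = 20
GENERATIONS = 30
MUTATION_STD = 0.1
TARGET_KEEP = 0.5        # e.g. aiming for roughly half of the original size


def random_candidate():
    """A candidate sub-network: one keep ratio in [0.1, 1.0] per layer."""
    return [random.uniform(0.1, 1.0) for _ in range(NUM_LAYERS)]


def fitness(candidate):
    """Placeholder fitness: stay close to the target budget with balanced layers.
    In practice this would be validation accuracy of the pruned sub-network
    after inheriting / reconstructing weights from the original model."""
    mean_keep = sum(candidate) / len(candidate)
    budget_penalty = abs(mean_keep - TARGET_KEEP)
    variance = sum((r - mean_keep) ** 2 for r in candidate) / len(candidate)
    return -budget_penalty - 0.1 * variance


def mutate(candidate):
    """Gaussian perturbation of each layer's keep ratio, clipped to [0.1, 1.0]."""
    return [min(1.0, max(0.1, r + random.gauss(0.0, MUTATION_STD)))
            for r in candidate]


def crossover(a, b):
    """Uniform crossover: each layer's ratio comes from one of the two parents."""
    return [random.choice(pair) for pair in zip(a, b)]


def evolve():
    population = [random_candidate() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=fitness, reverse=True)
        parents = population[: POP_SIZE // 2]          # keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children
    return max(population, key=fitness)


if __name__ == "__main__":
    best = evolve()
    print("best per-layer keep ratios:", [round(r, 2) for r in best])
```

The key design point this sketch tries to convey is that the search itself is generic: because candidates are just structural descriptions scored by a fitness function, the same loop can drive pruning for either a vision transformer or a CNN.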
