AntiDote: Attention-based Dynamic Optimization for Neural Network Runtime Efficiency

08/14/2020
by Fuxun Yu, et al.

Convolutional Neural Networks (CNNs) achieve great cognitive performance at the expense of considerable computation load. To relieve this load, many optimization methods have been developed to reduce model redundancy by identifying and removing insignificant model components, through techniques such as weight sparsification and filter pruning. However, these methods evaluate a component's significance statically, using only internal parameter information and ignoring its dynamic interaction with external inputs. Because feature activations vary per input, component significance changes dynamically, so static methods can only achieve sub-optimal results. We therefore propose a dynamic CNN optimization framework. Built on the neural network attention mechanism, the framework comprises (1) testing-phase channel and column feature map pruning, and (2) training-phase optimization by targeted dropout. This dynamic design has several benefits: first, it accurately identifies and aggressively removes per-input feature redundancy by considering the model-input interaction; second, its multi-dimension flexibility lets it remove feature map redundancy across multiple dimensions; third, the training-testing co-optimization favors dynamic pruning and helps maintain model accuracy even at very high feature pruning ratios. Extensive experiments show that our method achieves a 37.4% reduction with negligible accuracy drop on various test networks.
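To make the attention-guided test-phase pruning concrete, here is a minimal PyTorch sketch of per-input channel pruning. The SE-style scorer, the `keep_ratio` parameter, and the `AttentionChannelPruner` name are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

class AttentionChannelPruner(nn.Module):
    """Scores channels with an SE-style attention head, then zeroes the
    least significant channels for the current input at test time.
    (Illustrative sketch, not the paper's exact architecture.)"""

    def __init__(self, channels: int, keep_ratio: float = 0.5):
        super().__init__()
        self.keep = max(1, int(channels * keep_ratio))
        self.attn = nn.Sequential(      # squeeze-and-excitation style scorer
            nn.AdaptiveAvgPool2d(1),    # (N, C, H, W) -> (N, C, 1, 1)
            nn.Flatten(),               # -> (N, C)
            nn.Linear(channels, channels),
            nn.Sigmoid(),               # per-channel significance in (0, 1)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        scores = self.attn(x)                                # (N, C)
        # Keep only the top-k channels *per input*; the rest are zeroed,
        # so downstream layers can skip the pruned feature maps.
        topk = scores.topk(self.keep, dim=1).indices
        mask = torch.zeros_like(scores).scatter_(1, topk, 1.0)
        return x * mask.unsqueeze(-1).unsqueeze(-1)

# Usage: placed after a conv block; the mask changes with every input.
pruner = AttentionChannelPruner(channels=64, keep_ratio=0.5)
pruned = pruner(torch.randn(8, 64, 32, 32))
```

Because the mask is recomputed for every input, the pruning ratio can be set far more aggressively than a static criterion would allow, which is the key advantage the abstract claims for dynamic optimization.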
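The training-phase half can be sketched similarly: targeted dropout stochastically drops the channels that the attention scores mark as least significant, so the network learns to tolerate their later removal at test time. The `gamma`/`alpha` rates and the `targeted_dropout` helper below are hypothetical choices for illustration, not the paper's settings.

```python
import torch

def targeted_dropout(x: torch.Tensor, scores: torch.Tensor,
                     gamma: float = 0.5, alpha: float = 0.5) -> torch.Tensor:
    """x: (N, C, H, W) feature maps; scores: (N, C) per-channel significance.
    The gamma fraction of *least* significant channels become drop candidates,
    and each candidate is dropped with probability alpha during training.
    (Hypothetical sketch of the training-phase targeted dropout.)"""
    n, c = scores.shape
    n_cand = int(c * gamma)
    # Indices of the lowest-scoring channels per input.
    cand = scores.topk(n_cand, dim=1, largest=False).indices
    # Bernoulli(alpha) decides which candidates are actually dropped.
    drop = (torch.rand(n, n_cand, device=x.device) < alpha).float()
    mask = torch.ones_like(scores).scatter_(1, cand, 1.0 - drop)
    return x * mask.unsqueeze(-1).unsqueeze(-1)
```

Applying this only to the low-significance candidates, rather than uniformly as in standard dropout, biases the model toward robustness against exactly the channels the dynamic pruner will remove, which is how the training-testing co-optimization preserves accuracy.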


Related research:

- 09/13/2019 · DASNet: Dynamic Activation Sparsity for Neural Network Efficiency Improvement
  "To improve the execution speed and efficiency of neural networks in embe..."
- 10/29/2018 · Demystifying Neural Network Filter Pruning
  "Based on filter magnitude ranking (e.g. L1 norm), conventional filter pr..."
- 10/12/2018 · Interpretable Convolutional Filter Pruning
  "The sophisticated structure of Convolutional Neural Network (CNN) allows..."
- 07/30/2021 · Manipulating Identical Filter Redundancy for Efficient Pruning on Deep and Complicated CNN
  "The existence of redundancy in Convolutional Neural Networks (CNNs) enab..."
- 09/06/2018 · 2PFPCE: Two-Phase Filter Pruning Based on Conditional Entropy
  "Deep Convolutional Neural Networks (CNNs) offer remarkable performance o..."
- 05/23/2019 · Disentangling Redundancy for Multi-Task Pruning
  "Can prior network pruning strategies eliminate redundancy in multiple co..."
- 12/31/2021 · Multi-Dimensional Model Compression of Vision Transformer
  "Vision transformers (ViT) have recently attracted considerable attention..."
