Dynamic Network Surgery for Efficient DNNs

08/16/2016
by   Yiwen Guo, et al.

Deep learning has become a ubiquitous technology for improving machine intelligence. However, most existing deep models are structurally very complex, making them difficult to deploy on mobile platforms with limited computational power. In this paper, we propose a novel network compression method called dynamic network surgery, which can remarkably reduce network complexity through on-the-fly connection pruning. Unlike previous methods that accomplish this task in a greedy way, we incorporate connection splicing into the whole process to avoid incorrect pruning, turning compression into a continual maintenance of the network. Experiments demonstrate the effectiveness of our method: without any accuracy loss, it compresses the number of parameters in LeNet-5 and AlexNet by factors of 108× and 17.7× respectively, outperforming the recent pruning method by considerable margins. Code and some models are available at https://github.com/yiwenguo/Dynamic-Network-Surgery.
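The core idea described in the abstract, keeping a binary mask alongside each weight tensor, pruning connections by zeroing mask entries, and splicing them back in when a weight's magnitude recovers, is simple enough to sketch. The snippet below is a minimal PyTorch illustration of that prune-and-splice loop; the layer, threshold values, and update schedule are illustrative assumptions for this sketch, not the authors' released code.

```python
# Minimal sketch of the prune-and-splice idea behind dynamic network surgery.
# A layer keeps a full weight tensor W and a binary mask T; the forward pass
# uses W * T, the mask is refreshed periodically so that weak connections are
# pruned and connections that regain magnitude are spliced back in.
# Thresholds and names are illustrative, not the paper's exact settings.

import torch
import torch.nn as nn

class SurgeryLinear(nn.Module):
    def __init__(self, in_features, out_features, prune_th=0.05, splice_th=0.10):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.bias = nn.Parameter(torch.zeros(out_features))
        # Binary mask T: 1 keeps a connection, 0 prunes it. Not trained by SGD.
        self.register_buffer("mask", torch.ones_like(self.weight))
        self.prune_th = prune_th    # below this magnitude -> prune (T = 0)
        self.splice_th = splice_th  # above this magnitude -> splice back (T = 1)

    def forward(self, x):
        w_masked = self.weight * self.mask
        # Straight-through style masking: the forward pass sees only the
        # surviving connections, but the gradient reaches every entry of
        # self.weight, so pruned weights can grow back and be spliced in at
        # the next mask update. This mirrors the spirit of the paper's update
        # rule rather than its exact implementation.
        w_eff = self.weight + (w_masked - self.weight).detach()
        return nn.functional.linear(x, w_eff, self.bias)

    @torch.no_grad()
    def update_mask(self):
        w = self.weight.abs()
        self.mask[w < self.prune_th] = 0.0   # prune weak connections
        self.mask[w > self.splice_th] = 1.0  # splice recovered connections
        # Entries between the two thresholds keep their previous mask value.

# Toy training loop: the mask is refreshed every few iterations, so a
# connection pruned too early can come back once its weight grows again.
layer = SurgeryLinear(8, 4)
opt = torch.optim.SGD(layer.parameters(), lr=0.1)
for step in range(100):
    x = torch.randn(32, 8)
    target = torch.randn(32, 4)
    loss = ((layer(x) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    if step % 10 == 0:
        layer.update_mask()

print(f"final sparsity: {1.0 - layer.mask.mean().item():.2%}")
```

The two thresholds play the role of the pruning and splicing criteria: using a lower bar for pruning than for splicing gives a hysteresis band, so connections do not oscillate between the two states on every mask update.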


Related research

07/06/2022  Network Pruning via Feature Shift Minimization
04/26/2022  Attentive Fine-Grained Structured Sparsity for Image Restoration
08/21/2023  Efficient Joint Optimization of Layer-Adaptive Weight Pruning in Deep Neural Networks
04/05/2021  Branch-and-Pruning Optimization Towards Global Optimality in Deep Learning
02/12/2019  Effective Network Compression Using Simulation-Guided Iterative Pruning
07/22/2022  FairGRAPE: Fairness-aware GRAdient Pruning mEthod for Face Attribute Classification
05/10/2020  Compact Neural Representation Using Attentive Network Pruning
