Effective Network Compression Using Simulation-Guided Iterative Pruning

02/12/2019
by   Dae-Woong Jeong, et al.

Existing high-performance deep learning models require intensive computation, which makes it difficult to embed them in systems with limited resources. In this paper, we propose a novel network-compression method to address this limitation. The key idea is to make iterative pruning more effective and precise by simulating the reduced network before each pruning step. A simple experiment was conducted to evaluate the method; the results show that the proposed method achieves higher performance than existing methods at the same pruning level.
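The abstract leaves the simulation step unspecified. As a rough illustration only (not the authors' implementation), the sketch below pairs magnitude-based iterative pruning with a simulated evaluation of each candidate reduced network on held-out data before committing a pruning step. The function names, the candidate ratios, and the tolerance-based acceptance rule are all assumptions made for this example.

```python
import numpy as np

def simulate_loss(w, mask, X, y):
    """Mean-squared validation error of the network as if pruned by `mask`."""
    pred = X @ (w * mask)
    return float(np.mean((pred - y) ** 2))

def simulation_guided_prune(w, X, y, steps=5, ratios=(0.4, 0.2, 0.1), tol=1e-6):
    """At each iteration, simulate candidate reduced networks on held-out
    data and commit the most aggressive pruning ratio whose simulated
    loss stays within `tol` of the unpruned baseline (an assumed rule)."""
    mask = np.ones_like(w)
    base = simulate_loss(w, mask, X, y)
    for _ in range(steps):
        chosen = None
        for ratio in sorted(ratios, reverse=True):  # try aggressive ratios first
            alive = np.flatnonzero(mask)
            k = int(ratio * alive.size)
            if k == 0:
                continue
            # Candidate: additionally prune the k smallest-magnitude live weights.
            order = alive[np.argsort(np.abs(w[alive]))]
            cand = mask.copy()
            cand[order[:k]] = 0.0
            if simulate_loss(w, cand, X, y) <= base + tol:
                chosen = cand
                break
        if chosen is None:  # no candidate survives the simulation check
            break
        mask = chosen
    return mask

# Toy linear "network": 8 informative weights plus 12 redundant zero weights.
rng = np.random.default_rng(0)
w = np.concatenate([np.array([2.0, -1.5, 1.0, 3.0, -2.5, 0.5, 1.2, -0.8]),
                    np.zeros(12)])
X = rng.normal(size=(200, 20))
y = X @ w
mask = simulation_guided_prune(w, X, y)
print(int(mask.sum()))  # -> 8: only the informative weights survive
```

Because every candidate mask is scored on the simulated reduced network rather than on weight magnitude alone, pruning stops exactly when further reduction would degrade held-out performance.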


Related research

- Iterative Activation-based Structured Pruning (01/22/2022): Deploying complex deep learning models on edge devices is challenging be...
- Neural Network Compression via Effective Filter Analysis and Hierarchical Pruning (06/07/2022): Network compression is crucial to making the deep networks to be more ef...
- Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning (01/30/2022): With the remarkable success of deep learning recently, efficient network...
- Distilled Pruning: Using Synthetic Data to Win the Lottery (07/07/2023): This work introduces a novel approach to pruning deep learning models by...
- CLIP: Train Faster with Less Data (12/02/2022): Deep learning models require an enormous amount of data for training. Ho...
- Dimensionality Reduced Training by Pruning and Freezing Parts of a Deep Neural Network, a Survey (05/17/2022): State-of-the-art deep learning models have a parameter count that reache...
- Dynamic Network Surgery for Efficient DNNs (08/16/2016): Deep learning has become a ubiquitous technology to improve machine inte...
