Win the Lottery Ticket via Fourier Analysis: Frequencies Guided Network Pruning

01/30/2022
by Yuzhang Shang, et al.

With the remarkable recent success of deep learning, efficient network compression algorithms are urgently needed to unlock the computational potential of edge devices such as smartphones and tablets. However, optimal network pruning is a non-trivial task; mathematically, it is an NP-hard problem. Previous researchers have likened training a pruned network to buying a lottery ticket. In this paper, we investigate the Magnitude-Based Pruning (MBP) scheme and analyze it from a novel perspective, using Fourier analysis of the deep learning model to guide model design. Besides explaining the generalization ability of MBP with the Fourier transform, we also propose a novel two-stage pruning approach: the first stage obtains the topological structure of the pruned network, and the second stage retrains the pruned network to recover its capacity through knowledge distillation that proceeds from lower to higher frequencies in the frequency domain. Extensive experiments on CIFAR-10 and CIFAR-100 demonstrate the superiority of our Fourier-analysis-based MBP over traditional MBP algorithms.
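To make the first stage concrete, below is a minimal sketch of global magnitude-based pruning in PyTorch. The function name, the global-threshold rule, and the default sparsity level are illustrative assumptions rather than the paper's exact recipe; the masks it returns stand in for the pruned topology that the second (distillation) stage would then retrain.

```python
# Minimal sketch of stage one: global magnitude-based pruning (MBP).
# Assumptions (not from the paper): PyTorch, unstructured pruning over
# Conv2d/Linear weights, and a single global magnitude threshold.
import torch
import torch.nn as nn


def magnitude_prune(model: nn.Module, sparsity: float = 0.8) -> dict:
    """Zero the smallest-magnitude weights in all Conv2d/Linear layers.

    Returns per-layer binary masks describing the pruned topology, which
    the retraining (knowledge-distillation) stage would keep fixed.
    """
    prunable = [(name, m) for name, m in model.named_modules()
                if isinstance(m, (nn.Conv2d, nn.Linear))]

    # Global threshold: the k-th smallest absolute weight value, where
    # k corresponds to the requested sparsity level.
    all_abs = torch.cat([m.weight.detach().abs().flatten() for _, m in prunable])
    k = int(sparsity * all_abs.numel())
    threshold = all_abs.kthvalue(max(k, 1)).values

    masks = {}
    with torch.no_grad():
        for name, module in prunable:
            mask = (module.weight.abs() > threshold).float()
            module.weight.mul_(mask)  # prune in place
            masks[name] = mask
    return masks
```

A typical call would be masks = magnitude_prune(model, sparsity=0.8), after which the surviving weights are fine-tuned with the masks held fixed; in the paper's second stage that fine-tuning is driven by knowledge distillation ordered from low to high frequencies, which this sketch does not attempt to reproduce.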


Related research

01/24/2019 - Really should we pruning after model be totally trained? Pruning based on a small amount of training
Pre-training of models in pruning algorithms plays an important role in ...

02/12/2019 - Effective Network Compression Using Simulation-Guided Iterative Pruning
Existing high-performance deep learning models require very intensive co...

07/07/2023 - Distilled Pruning: Using Synthetic Data to Win the Lottery
This work introduces a novel approach to pruning deep learning models by...

06/02/2023 - Group channel pruning and spatial attention distilling for object detection
Due to the over-parameterization of neural networks, many model compress...

05/07/2022 - Automatic Block-wise Pruning with Auxiliary Gating Structures for Deep Convolutional Neural Networks
Convolutional neural networks are prevailing in deep learning tasks. How...

12/02/2020 - An Once-for-All Budgeted Pruning Framework for ConvNets Considering Input Resolution
We propose an efficient once-for-all budgeted pruning framework (OFARPru...

02/21/2022 - A Novel Architecture Slimming Method for Network Pruning and Knowledge Distillation
Network pruning and knowledge distillation are two widely-known model co...
