RGP: Neural Network Pruning through Its Regular Graph Structure

10/28/2021
by Zhuangzhi Chen, et al.

Lightweight model design has become an important direction in the application of deep learning technology, and pruning is an effective means of achieving large reductions in model parameters and FLOPs. Most existing neural network pruning methods start from the importance of parameters and design parameter evaluation metrics to prune parameters iteratively. These methods do not consider model topology, may be effective but not efficient, and require completely different pruning for different datasets. In this paper, we study the graph structure of the neural network and propose regular graph based pruning (RGP) to perform one-shot neural network pruning. We generate a regular graph, set the node degree of the graph to meet the pruning ratio, and reduce the average shortest path length of the graph by swapping edges to obtain an optimal edge distribution. Finally, the obtained graph is mapped to a neural network structure to realize pruning. Experiments show that the average shortest path length of the graph is negatively correlated with the classification accuracy of the corresponding neural network, and that the proposed RGP shows a strong precision retention capability with extremely high parameter reduction (more than 90% reduction in parameters and FLOPs).
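The edge-swap step can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it builds a random regular graph with networkx (the node count, degree, and trial budget are illustrative assumptions), then greedily accepts degree-preserving double edge swaps that lower the average shortest path length, which is the quantity RGP reduces before mapping the graph back onto a neural network structure.

    import random
    import networkx as nx

    def optimize_regular_graph(num_nodes=64, degree=4, num_trials=2000, seed=0):
        # Build a connected random regular graph; num_nodes and degree are
        # illustrative, with degree chosen to match the desired pruning ratio.
        G = nx.random_regular_graph(degree, num_nodes, seed=seed)
        while not nx.is_connected(G):
            seed += 1
            G = nx.random_regular_graph(degree, num_nodes, seed=seed)
        best_aspl = nx.average_shortest_path_length(G)
        rng = random.Random(seed)
        for _ in range(num_trials):
            H = G.copy()
            # A double edge swap preserves every node's degree, so H stays regular.
            nx.double_edge_swap(H, nswap=1, max_tries=100, seed=rng.randint(0, 10**9))
            if not nx.is_connected(H):
                continue
            aspl = nx.average_shortest_path_length(H)
            if aspl < best_aspl:  # keep the swap only if it shortens average paths
                G, best_aspl = H, aspl
        return G, best_aspl

    graph, aspl = optimize_regular_graph()
    print("optimized average shortest path length: %.3f" % aspl)

Greedy swapping is only one way to lower the average shortest path length; the point of the sketch is that the degree constraint (and hence the pruning ratio) is preserved at every step while the graph's paths are shortened.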


