Sequential Encryption of Sparse Neural Networks Toward Optimum Representation of Irregular Sparsity

05/05/2021
by Baeseong Park, et al.

Even though fine-grained pruning techniques achieve a high compression ratio, the conventional sparsity representations associated with irregular sparsity (such as CSR) significantly degrade parallelism. Practical pruning methods therefore usually lower pruning rates (via structured pruning) to improve parallelism. In this paper, we study a fixed-to-fixed (lossless) encryption architecture and algorithm to support fine-grained pruning methods, so that sparse neural networks can be stored in a highly regular structure. We first estimate the maximum compression ratio of encryption-based compression using entropy. Then, in an effort to push the compression ratio toward this theoretical maximum, we propose a sequential fixed-to-fixed encryption scheme. We demonstrate that the proposed compression scheme achieves nearly the maximum compression ratio for the Transformer and ResNet-50 pruned by various fine-grained pruning methods.
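As a back-of-the-envelope illustration of the entropy bound the abstract refers to (a minimal sketch, not code from the paper), the following Python snippet estimates the maximum lossless compression ratio when the sparsity mask is modeled as i.i.d. Bernoulli; the function names and the 16-bit weight width are illustrative assumptions.

import math

def entropy_bits(s: float) -> float:
    """Shannon entropy (bits) of a Bernoulli(s) mask entry."""
    if s in (0.0, 1.0):
        return 0.0
    return -s * math.log2(s) - (1 - s) * math.log2(1 - s)

def max_compression_ratio(pruning_rate: float, bits_per_weight: int = 16) -> float:
    """Entropy bound on a lossless (fixed-to-fixed) encoding:
    dense bits / (mask bits + surviving-value bits) per position."""
    s = 1.0 - pruning_rate             # density (fraction of surviving weights)
    mask_bits = entropy_bits(s)        # at least H(s) bits per position for the mask
    value_bits = s * bits_per_weight   # surviving weights stored verbatim
    return bits_per_weight / (mask_bits + value_bits)

# e.g. 90% unstructured pruning of 16-bit weights -> about 7.7x at the entropy limit
print(f"{max_compression_ratio(0.9):.2f}x")

Under this simple model, 90% pruning of 16-bit weights caps out near 7.7x; this is the kind of entropy limit the paper's sequential fixed-to-fixed scheme aims to approach while keeping the stored format regular.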

Related Research

05/14/2019  Network Pruning for Low-Rank Binary Indexing
Pruning is an efficient model compression technique to remove redundancy...

01/17/2021  Network Automatic Pruning: Start NAP and Take a Nap
Network pruning can significantly reduce the computation and memory foot...

02/11/2020  PCNN: Pattern-based Fine-Grained Regular Pruning towards Optimizing CNN Accelerators
Weight pruning is a powerful technique to realize model compression. We ...

04/03/2021  Tight Compression: Compressing CNN Through Fine-Grained Pruning and Weight Permutation for Efficient Implementation
The unstructured sparsity after pruning poses a challenge to the efficie...

07/27/2021  Experiments on Properties of Hidden Structures of Sparse Neural Networks
Sparsity in the structure of Neural Networks can lead to less energy con...

11/01/2018  Balanced Sparsity for Efficient DNN Inference on GPU
In trained deep neural networks, unstructured pruning can reduce redunda...

02/08/2021  Learning N:M Fine-grained Structured Sparse Neural Networks From Scratch
Sparsity in Deep Neural Networks (DNNs) has been widely studied to compr...