PRUNIX: Non-Ideality Aware Convolutional Neural Network Pruning for Memristive Accelerators

02/03/2022
by Ali Alshaarawy, et al.

In this work, PRUNIX, a framework for training and pruning convolutional neural networks, is proposed for deployment on memristor-crossbar-based accelerators. PRUNIX accounts for the numerous non-ideal effects of memristor crossbars, including weight quantization, state drift, aging, and stuck-at faults. PRUNIX utilises a novel Group Sawtooth Regularization intended to improve non-ideality tolerance as well as sparsity, and a novel Adaptive Pruning Algorithm (APA) intended to minimise accuracy loss by considering the sensitivity of different layers of a CNN to pruning. We compare our regularization and pruning methods with other standard techniques on multiple CNN architectures and observe a 13% improvement in accuracy when quantization and other non-ideal effects are accounted for, at an overall sparsity of 85%.
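The abstract names two components: a Group Sawtooth Regularization term that steers weights toward memristor-friendly values while promoting structured sparsity, and an Adaptive Pruning Algorithm that sets per-layer pruning aggressiveness according to sensitivity. As a rough illustration of the first idea only, the sketch below combines a sawtooth-shaped penalty (the distance of each weight to its nearest discrete conductance level) with a filter-wise group-lasso term. The level grid, coefficients, and function names are assumptions for illustration, not the paper's exact formulation.

    # Hypothetical sketch of a sawtooth-style regularizer plus a group term.
    # The quantization grid, coefficients, and names are illustrative
    # assumptions, not the formulation used in PRUNIX.
    import torch

    def sawtooth_group_penalty(weight, num_levels=16, w_max=1.0,
                               lambda_saw=1e-4, lambda_group=1e-4):
        """Scalar penalty for one convolutional weight tensor.

        weight:     tensor of shape (out_channels, in_channels, kH, kW)
        num_levels: assumed number of discrete conductance levels per device
        """
        # Distance of each weight to its nearest discrete level: plotted
        # against the weight's value, this penalty has a sawtooth shape,
        # pulling weights toward representable conductance states.
        step = 2.0 * w_max / (num_levels - 1)
        nearest = torch.round(weight / step) * step
        saw = (weight - nearest).abs().sum()

        # Group-lasso term over output filters encourages whole filters to
        # shrink toward zero so they can be pruned together (structured sparsity).
        group = weight.flatten(1).norm(dim=1).sum()

        return lambda_saw * saw + lambda_group * group

    # Illustrative use inside a training loop:
    # loss = criterion(model(x), y)
    # for m in model.modules():
    #     if isinstance(m, torch.nn.Conv2d):
    #         loss = loss + sawtooth_group_penalty(m.weight)
    # loss.backward()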

