Convolutional Neural Network with Pruning Method for Handwritten Digit Recognition

01/15/2021
by Mengyu Chen, et al.

The CNN is a popular model for image analysis, so it can be applied to recognizing handwritten digits from the MNIST dataset. To improve recognition accuracy, we trained CNN models with different fully connected layer sizes to study the relationship between the fully connected layer size and recognition accuracy. Inspired by previous pruning work, we applied the distinctiveness pruning method to CNN models and compared the pruning performance with that on plain NN models. To improve pruning performance on CNNs, we also explored the effect of the angle threshold. The evaluation results show that: (1) there is a threshold for the fully connected layer size, such that recognition accuracy grows as the layer size increases below the threshold and falls as it increases beyond it; (2) pruning performs worse on CNNs than on NNs; (3) as the pruning angle threshold increases, both the fully connected layer size and the recognition accuracy decrease. This paper also shows that CNN models trained on MNIST are capable of handwritten digit recognition and achieve the highest recognition accuracy with a fully connected layer size of 400. In addition, on the same MNIST dataset, these CNN models outperform the large, deep, simple NN models reported in a published paper.
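The distinctiveness pruning referred to above compares the angle between the output activation vectors of pairs of hidden units over the training data: near-parallel units are redundant, so one can be merged into the other, while near-opposite units cancel and can both be removed. The paper's exact implementation is not reproduced on this page; the following is a minimal NumPy sketch in the spirit of Gedeon and Harris's formulation, assuming activations in [0, 1], with all names (distinctiveness_prune, the 15-degree default threshold) chosen for illustration.

```python
# Minimal sketch of distinctiveness pruning on one fully connected layer.
# Hypothetical names and shapes; the paper's exact procedure is not given here.
import numpy as np

def distinctiveness_prune(activations, weights_out, angle_threshold_deg=15.0):
    """Prune hidden units whose activation vectors (collected over a sample
    of training inputs) point in nearly the same or opposite directions.

    activations : (n_samples, n_units) hidden-unit outputs, assumed in [0, 1]
    weights_out : (n_units, n_outputs) weights from hidden units to next layer
    Returns the indices of surviving units and their merged outgoing weights.
    """
    # Centre activations at 0.5 so anti-correlated units sit near 180 degrees,
    # following the distinctiveness formulation for [0, 1] activations.
    vectors = activations - 0.5
    keep = list(range(vectors.shape[1]))
    merged = weights_out.copy()

    i = 0
    while i < len(keep):
        j = i + 1
        while j < len(keep):
            a, b = vectors[:, keep[i]], vectors[:, keep[j]]
            cos = a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
            angle = np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))
            if angle < angle_threshold_deg:
                # Units behave alike: fold j's outgoing weights into i and
                # drop j, keeping the layer's function roughly unchanged.
                merged[keep[i]] += merged[keep[j]]
                keep.pop(j)
            elif angle > 180.0 - angle_threshold_deg:
                # Units roughly cancel each other: remove both.
                keep.pop(j)
                keep.pop(i)
                i -= 1
                break
            else:
                j += 1
        i += 1

    return keep, merged[keep]
```

With activations collected from a forward pass over a training batch, the surviving indices can be used to rebuild a smaller fully connected layer; raising angle_threshold_deg prunes more units at the cost of accuracy, consistent with the trend the abstract reports.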
