Convolutional Neural Network with Pruning Method for Handwritten Digit Recognition

01/15/2021
by Mengyu Chen, et al.

Convolutional neural networks (CNNs) are a popular method for image analysis and can be used to recognize handwritten digits from the MNIST dataset. To achieve higher recognition accuracy, we trained CNN models with different fully connected layer sizes to examine the relationship between the fully connected layer size and the recognition accuracy. Inspired by previous pruning work, we applied distinctiveness pruning to the CNN models and compared the pruning performance with that of plain neural network (NN) models. To improve pruning on CNNs, we also explored the effect of the angle threshold on pruning performance. The evaluation results show that: for the fully connected layer, there is a threshold size below which recognition accuracy grows as the layer size increases, and above which it falls; pruning performs worse on CNN models than on NN models; and as the pruning angle threshold increases, both the resulting fully connected layer size and the recognition accuracy decrease. This paper also shows that CNN models trained on MNIST are capable of handwritten digit recognition, achieving the highest recognition accuracy with a fully connected layer size of 400, and that on the same MNIST dataset, CNN models outperform the big, deep, simple NN models of a previously published paper.
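The distinctiveness pruning idea referenced in the abstract compares the angle between the activation vectors of hidden units over a dataset: units whose vectors are nearly parallel carry redundant information and can be merged. The sketch below is a minimal, hypothetical NumPy illustration of that angle test (function name, the `[0, 1]` sigmoid-output assumption, and the default threshold are ours, not taken from the paper):

```python
import numpy as np

def find_redundant_units(activations, angle_threshold_deg=15.0):
    """Return pairs of hidden units whose activation vectors over the
    dataset are nearly parallel, i.e. candidates for merging.

    activations: (n_samples, n_units) matrix of hidden-unit outputs,
    assumed to lie in [0, 1] (e.g. sigmoid units), so we shift them to
    [-0.5, 0.5] before measuring angles, as in distinctiveness pruning.
    """
    shifted = activations - 0.5
    # Normalise each unit's activation vector (columns) to unit length.
    norms = np.linalg.norm(shifted, axis=0, keepdims=True)
    unit_vecs = shifted / np.clip(norms, 1e-12, None)

    # Pairwise cosines between unit activation vectors -> angles in degrees.
    cosines = unit_vecs.T @ unit_vecs
    angles = np.degrees(np.arccos(np.clip(cosines, -1.0, 1.0)))

    n_units = activations.shape[1]
    return [(i, j)
            for i in range(n_units)
            for j in range(i + 1, n_units)
            if angles[i, j] < angle_threshold_deg]
```

A larger angle threshold marks more unit pairs as redundant, which shrinks the fully connected layer further; the abstract reports that accuracy drops along with the layer size as this threshold grows.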

