Deep Big Simple Neural Nets Excel on Handwritten Digit Recognition

03/01/2010
by Dan Claudiu Ciresan, et al.

Good old on-line back-propagation for plain multi-layer perceptrons yields a very low 0.35% error rate on the famous MNIST handwritten digits benchmark. All we need to achieve this best result so far are many hidden layers, many neurons per layer, numerous deformed training images, and graphics cards to greatly speed up learning.
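The recipe the abstract describes, a plain multi-layer perceptron trained with on-line (per-example) back-propagation, can be sketched as follows. This is an illustrative toy version, not the paper's setup: the layer sizes, learning rate, tanh/softmax choices, and the small synthetic dataset below are all assumptions standing in for MNIST, image deformations, and GPU training.

```python
# Minimal sketch of a plain MLP trained with on-line back-propagation.
# All hyperparameters and the toy data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    """One (weights, bias) pair per layer, with scaled Gaussian init."""
    return [(rng.normal(0, np.sqrt(2.0 / m), (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def forward(params, x):
    """Return the activations of every layer (tanh hidden, linear output)."""
    acts = [x]
    for i, (W, b) in enumerate(params):
        z = acts[-1] @ W + b
        acts.append(z if i == len(params) - 1 else np.tanh(z))
    return acts

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def train_step(params, x, y, lr=0.05):
    """One on-line back-propagation update for a single (x, y) example."""
    acts = forward(params, x)
    n_out = params[-1][0].shape[1]
    delta = softmax(acts[-1]) - np.eye(n_out)[y]  # dLoss/dz at the output
    for i in range(len(params) - 1, -1, -1):
        W, b = params[i]
        gW = np.outer(acts[i], delta)             # gradient w.r.t. weights
        params[i] = (W - lr * gW, b - lr * delta)
        if i > 0:
            delta = (delta @ W.T) * (1.0 - acts[i] ** 2)  # tanh derivative
    return params

# Toy two-class problem standing in for the MNIST digits.
X = rng.normal(size=(200, 8))
y = (X[:, 0] + X[:, 1] > 0).astype(int)

params = init_mlp([8, 16, 16, 2])   # "many" hidden layers, scaled way down
for epoch in range(20):
    for xi, yi in zip(X, y):
        params = train_step(params, xi, yi)

preds = np.array([forward(params, xi)[-1].argmax() for xi in X])
accuracy = (preds == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The "on-line" part is simply that weights are updated after every single example rather than after a batch; the paper's contribution was showing that this old scheme, scaled up with big deformed training sets and GPU speed, suffices for state-of-the-art MNIST results.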

research
07/17/2015

Massively Deep Artificial Neural Networks for Handwritten Digit Recognition

Greedy Restricted Boltzmann Machines yield a fairly low 0.72 the famou...
research
03/23/2011

Handwritten Digit Recognition with a Committee of Deep Neural Nets on GPUs

The competitive MNIST handwritten digit recognition benchmark has a long...
research
06/12/2013

Understanding Dropout: Training Multi-Layer Perceptrons with Auxiliary Independent Stochastic Neurons

In this paper, a simple, general method of adding auxiliary stochastic n...
research
04/24/2019

Layer Dynamics of Linearised Neural Nets

Despite the phenomenal success of deep learning in recent years, there r...
research
07/04/2021

Multi-layer Hebbian networks with modern deep learning frameworks

Deep learning networks generally use non-biological learning methods. By...
research
12/14/2018

On Stacked Denoising Autoencoder based Pre-training of ANN for Isolated Handwritten Bengali Numerals Dataset Recognition

This work attempts to find the most optimal parameter setting of a deep ...
research
02/19/2021

Training cascaded networks for speeded decisions using a temporal-difference loss

Although deep feedforward neural networks share some characteristics wit...
