Feed Forward and Backward Run in Deep Convolution Neural Network

11/09/2017
by Pushparaja Murugan, et al.

Convolutional Neural Networks (CNNs), commonly known as ConvNets, are widely used in visual imagery applications, object classification, and speech recognition. Since Krizhevsky's implementation and demonstration of a deep convolutional neural network for ImageNet classification in 2012, the architecture of deep Convolutional Neural Networks has attracted many researchers. This has led to major developments in deep learning frameworks such as TensorFlow, Caffe, Keras, and Theano. Although implementing deep learning is straightforward with these frameworks, the underlying mathematical theory and concepts are harder to grasp for new learners and practitioners. This article provides an overview of the ConvNet architecture and explains the mathematical theory behind it, including the activation function, the loss function, and feedforward and backward propagation. In this article, a grayscale image is taken as the input, ReLU and Sigmoid activation functions are used in developing the architecture, and a cross-entropy loss function is used to compute the difference between the predicted value and the actual value. The architecture is developed such that it contains one convolution layer, one pooling layer, and multiple dense layers.
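As a concrete illustration of the architecture the abstract describes (one convolution layer with ReLU, one pooling layer, and a dense sigmoid output scored with cross-entropy), here is a minimal NumPy sketch of the feedforward run plus the output-layer gradient of the backward run. The layer sizes, kernel shape, and variable names are illustrative assumptions and are not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: sizes, kernel, and names are assumptions.

def relu(x):
    return np.maximum(0.0, x)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d(image, kernel):
    """Valid 2-D convolution (cross-correlation) of a grayscale image."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, size=2):
    """Non-overlapping max pooling."""
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

def cross_entropy(y_true, y_pred, eps=1e-12):
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

rng = np.random.default_rng(0)
image = rng.random((8, 8))               # grayscale input image
kernel = rng.standard_normal((3, 3))     # single convolution filter

# Feedforward run
feature_map = relu(conv2d(image, kernel))     # convolution layer + ReLU
pooled = max_pool(feature_map).ravel()        # pooling layer, flattened

W = rng.standard_normal((pooled.size, 1)) * 0.1   # dense layer weights
b = np.zeros(1)
prediction = sigmoid(pooled @ W + b)              # dense layer + sigmoid

target = np.array([1.0])
loss = cross_entropy(target, prediction)
print("prediction:", prediction, "cross-entropy loss:", loss)

# Backward run at the output layer: for sigmoid + cross-entropy the
# error term simplifies to (prediction - target).
d_out = prediction - target
grad_W = np.outer(pooled, d_out)   # gradient w.r.t. dense weights
grad_b = d_out                     # gradient w.r.t. dense bias
```

The backward pass through the pooling and convolution layers follows the same chain-rule pattern the article derives; only the output-layer step is shown here to keep the sketch short.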


