Impact of Fully Connected Layers on Performance of Convolutional Neural Networks for Image Classification

01/21/2019
by S. H. Shabbeer Basha, et al.

Convolutional Neural Networks (CNNs) have largely removed the need for handcrafted features in domains such as computer vision, owing to their ability to learn problem-specific features from raw input data. However, selecting a dataset-specific CNN architecture, which is mostly done through experience or expertise, is a time-consuming and error-prone process. To help automate the process of learning a CNN architecture, this letter attempts to find the relationship between Fully Connected (FC) layers and certain characteristics of the datasets. CNN architectures, and more recently datasets as well, are commonly categorized as deep, shallow, wide, etc. This letter tries to formalize these terms and answers the following questions: (i) What is the impact of deeper/shallower architectures on the performance of a CNN with respect to FC layers? (ii) How do deeper/wider datasets influence the performance of a CNN with respect to FC layers? (iii) Which kind of architecture (deeper/shallower) is better suited to which kind of dataset (deeper/wider)? To address these questions, we performed experiments with three CNN architectures of different depths, varying the number of FC layers in each. We used four widely used datasets, namely CIFAR-10, CIFAR-100, Tiny ImageNet, and CRCHistoPhenotypes, to justify our findings in the context of image classification. The source code of this research is available at https://github.com/shabbeersh/Impact-of-FC-layers.
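The experimental setup described above, a fixed convolutional backbone evaluated with a varying number of FC layers, can be illustrated with a minimal sketch. The PyTorch code below is an assumption-laden illustration and not the authors' released implementation (see the GitHub link above); the backbone depth, FC width of 512, and the 32x32 input size (as in CIFAR-10/100) are choices made for the example only.

    # Minimal sketch: a small CNN whose classifier head takes a configurable
    # number of fully connected layers, mirroring the idea of varying FC depth.
    import torch
    import torch.nn as nn

    class SmallCNN(nn.Module):
        def __init__(self, num_classes=10, num_fc_layers=2, fc_width=512):
            super().__init__()
            # Convolutional feature extractor (depth/widths here are assumptions).
            self.features = nn.Sequential(
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
                nn.MaxPool2d(2),
            )
            self.flatten = nn.Flatten()
            # Build the FC head with the requested number of hidden FC layers.
            fc_layers = []
            in_dim = 64 * 8 * 8  # assumes 32x32 inputs such as CIFAR-10/100
            for _ in range(num_fc_layers):
                fc_layers += [nn.Linear(in_dim, fc_width), nn.ReLU()]
                in_dim = fc_width
            fc_layers.append(nn.Linear(in_dim, num_classes))
            self.classifier = nn.Sequential(*fc_layers)

        def forward(self, x):
            return self.classifier(self.flatten(self.features(x)))

    # Vary the FC-layer count (1, 2, 3, ...) and compare validation accuracy
    # across datasets to reproduce the kind of study the letter reports.
    model = SmallCNN(num_classes=10, num_fc_layers=3)
    print(model(torch.randn(1, 3, 32, 32)).shape)  # torch.Size([1, 10])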

