Multi-function Convolutional Neural Networks for Improving Image Classification Performance

05/30/2018
by Luna M. Zhang, et al.

Traditional Convolutional Neural Networks (CNNs) typically use the same activation function (usually ReLU) for every neuron that performs a non-linear mapping; for example, the deep convolutional architecture Inception-v4 uses ReLU throughout. To improve the classification performance of traditional CNNs, a new "Multi-function Convolutional Neural Network" (MCNN) is created by assigning different activation functions to different neurons. For n neurons and m different activation functions, there are m^n possible assignments in total, of which only m are traditional CNNs (all neurons sharing one function); the remaining m^n - m are MCNNs. Therefore, the best model is very likely to be an MCNN, since there are m^n - 2m more MCNNs than traditional CNNs. For performance analysis, two datasets for two applications are used: classifying handwritten digits from the MNIST database, and classifying brain MRI images into one of four stages of Alzheimer's disease (AD). For both applications, an activation function is randomly selected for each layer of an MCNN. For the AD diagnosis application, MCNNs based on a newly created multi-function Inception-v4 architecture are constructed. Overall, simulations show that MCNNs can outperform traditional CNNs in multi-class classification accuracy for both applications. An important direction for future research is to efficiently select the best MCNN from the m^n - m candidates. Current CNN software offers only partial MCNN functionality: different layers can use different activation functions, but individual neurons within the same layer cannot. Thus, extending current CNN architectures such as ResNets, DenseNets, and Dual Path Networks with multiple activation functions, and developing more effective and faster MCNN software systems and tools, would be very useful for solving difficult practical image classification problems.
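The paper does not include code, so the following is a minimal PyTorch sketch of the per-neuron idea as described in the abstract; the class name MultiFunctionConv2d, the three candidate activations, and the treatment of each output channel as one "neuron" are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a multi-function conv layer in
# PyTorch. Each output channel is treated as a "neuron" and is assigned its
# own activation function at random, mirroring the MCNN idea of mixing
# activation functions within a single layer.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

ACTIVATIONS = [F.relu, torch.tanh, torch.sigmoid]  # m = 3 candidate functions


class MultiFunctionConv2d(nn.Module):
    """Conv2d whose output channels use independently chosen activations."""

    def __init__(self, in_ch, out_ch, kernel_size, seed=0):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              padding=kernel_size // 2)
        rng = random.Random(seed)
        # Randomly fix one activation per output channel ("neuron").
        self.acts = [rng.choice(ACTIVATIONS) for _ in range(out_ch)]

    def forward(self, x):
        z = self.conv(x)
        # Apply each channel's own activation, then re-stack channels.
        return torch.stack(
            [act(z[:, c]) for c, act in enumerate(self.acts)], dim=1
        )


# Example: a tiny MCNN-style classifier for 28x28 MNIST digits.
model = nn.Sequential(
    MultiFunctionConv2d(1, 16, 3),
    nn.MaxPool2d(2),
    MultiFunctionConv2d(16, 32, 3),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),
)
print(model(torch.randn(4, 1, 28, 28)).shape)  # torch.Size([4, 10])
```

Even this toy model illustrates the combinatorics in the abstract: with m = 3 activations and the 48 channel-level "neurons" above (16 + 32), there are 3^48 - 3 possible MCNNs but only 3 traditional CNNs, which is why the abstract flags efficient selection of the best MCNN as future work.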


