Comparisons among different stochastic selection of activation layers for convolutional neural networks for healthcare

11/24/2020
by Loris Nanni, et al.

Classification of biological images is an important task with crucial applications in many fields, such as cell phenotype recognition, detection of cell organelles and histopathological classification, and it can support early medical diagnosis by enabling automatic disease classification without the need for a human expert. In this paper we classify biomedical images using ensembles of neural networks. We create each ensemble from a ResNet50 architecture, modifying its activation layers by substituting ReLUs with other functions. We select our activations from the following: ReLU, leaky ReLU, Parametric ReLU, ELU, Adaptive Piecewise Linear Unit, S-Shaped ReLU, Swish, Mish, Mexican Linear Unit, Gaussian Linear Unit, Parametric Deformable Linear Unit, Soft Root Sign (SRS) and others. As a baseline, we use an ensemble of neural networks that only use ReLU activations. We test our networks on several small and medium-sized biomedical image datasets. Our results show that our best ensemble outperforms the naive approaches. To encourage the reproducibility of this work, the MATLAB code of all the experiments will be shared at https://github.com/LorisNanni.
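The authors' implementation is in MATLAB (linked above). As a rough illustration of the idea, the sketch below defines a few of the activation functions named in the abstract in scalar form and fuses the class scores of hypothetical per-activation networks by the average rule. The function names, parameter defaults, and the fusion rule are illustrative assumptions, not the authors' exact code.

```python
import math

# A few of the activation functions compared in the paper (scalar form).
def relu(x):
    return max(0.0, x)

def leaky_relu(x, slope=0.01):  # slope of 0.01 is an assumed default
    return x if x > 0 else slope * x

def elu(x, alpha=1.0):
    return x if x > 0 else alpha * (math.exp(x) - 1.0)

def swish(x, beta=1.0):  # x * sigmoid(beta * x)
    return x / (1.0 + math.exp(-beta * x))

def mish(x):  # x * tanh(softplus(x))
    return x * math.tanh(math.log1p(math.exp(x)))

def soft_root_sign(x, alpha=2.0, beta=3.0):  # x / (x/alpha + e^(-x/beta))
    return x / (x / alpha + math.exp(-x / beta))

# Ensemble fusion: average the per-class scores of the individual
# networks (each trained with a different activation), then argmax.
def fuse(score_vectors):
    n = len(score_vectors)
    avg = [sum(s[c] for s in score_vectors) / n
           for c in range(len(score_vectors[0]))]
    return max(range(len(avg)), key=avg.__getitem__)

# Hypothetical softmax scores from three networks on a 3-class problem.
scores = [[0.2, 0.5, 0.3], [0.1, 0.6, 0.3], [0.4, 0.3, 0.3]]
print(fuse(scores))  # prints 1: class 1 has the highest average score
```

Averaging scores (the sum rule) is a common, simple fusion strategy for ensembles whose members differ only in their activation layers; any other combination rule could be substituted in `fuse`.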


