Comparison of different convolutional neural network activation functions and methods for building ensembles

03/29/2021
by   Loris Nanni, et al.

Recently, much attention has been devoted to finding highly efficient and powerful activation functions for CNN layers. Because activation functions inject different nonlinearities between layers, which affects performance, varying them is one method for building robust ensembles of CNNs. The objective of this study is to examine the performance of CNN ensembles built with different activation functions, including six new ones presented here: 2D Mexican ReLU, TanELU, MeLU+GaLU, Symmetric MeLU, Symmetric GaLU, and Flexible MeLU. The highest-performing ensemble was built from CNNs whose standard ReLU layers were randomly replaced with different activation layers. A comprehensive evaluation of the proposed approach was conducted across fifteen biomedical data sets representing various classification tasks. The proposed method was tested on two basic CNN architectures: Vgg16 and ResNet50. Results demonstrate the superior performance of this approach. The MATLAB source code for this study will be available at https://github.com/LorisNanni.
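The core idea, building an ensemble by randomly replacing each ReLU layer with an activation drawn from a pool and then fusing the members' outputs, can be sketched in plain Python. This is a minimal illustration with a toy one-dimensional "network"; the activation pool and the averaging fusion are stand-ins (the paper's actual pool includes the MeLU/GaLU variants named above, which are not reproduced here):

```python
import math
import random

# Hypothetical activation pool standing in for the paper's candidates
# (2D Mexican ReLU, TanELU, MeLU+GaLU, etc. are not reproduced here).
def relu(x):     return max(0.0, x)
def leaky(x):    return x if x > 0.0 else 0.01 * x
def tanh_act(x): return math.tanh(x)

POOL = [relu, leaky, tanh_act]

def make_member(weights, rng):
    """One ensemble member: same weights, but every activation layer is
    a random choice from the pool, mimicking the random replacement of
    the standard ReLU layers described in the abstract."""
    return [(w, rng.choice(POOL)) for w in weights]

def forward(member, x):
    # Toy forward pass: scalar weight followed by the layer's activation.
    for w, act in member:
        x = act(w * x)
    return x

def ensemble_output(members, x):
    """Fuse the members by averaging their outputs (a common fusion rule;
    the paper's exact fusion scheme may differ)."""
    return sum(forward(m, x) for m in members) / len(members)

rng = random.Random(0)
weights = [0.5, -1.2, 0.8]
members = [make_member(weights, rng) for _ in range(5)]
print(ensemble_output(members, 1.0))
```

Each member shares the same weights here for simplicity; in the paper's setting each CNN is trained independently, so the diversity comes both from the random activation assignment and from training.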


