
hyper-sinh: An Accurate and Reliable Function from Shallow to Deep Learning in TensorFlow and Keras

by   Luca Parisi, PhD, MBA, et al.

This paper presents 'hyper-sinh', a variation of the m-arcsinh activation function suited to Deep Learning (DL) algorithms for supervised learning, such as Convolutional Neural Networks (CNNs). Implemented in the open-source Python libraries TensorFlow and Keras, hyper-sinh is described and validated as an accurate and reliable activation function for both shallow and deep neural networks. Improvements in accuracy and reliability on image and text classification tasks across five (N = 5) benchmark data sets available from Keras are discussed. Experimental results demonstrate the competitive classification performance that both shallow and deep neural networks achieve with this novel function when evaluated against gold-standard activation functions, for both image and text classification.
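As a concrete illustration, a custom activation like hyper-sinh can be defined as a plain piecewise function and then wrapped in TensorFlow ops (e.g. tf.where) for use in a Keras model. The piecewise form used below — sinh(x)/3 for positive inputs, x³/4 otherwise — is an assumption based on the authors' description and should be checked against the paper's own definition; this is a minimal sketch, not the reference implementation.

```python
import math

def hyper_sinh(x: float) -> float:
    """Sketch of the hyper-sinh activation for a scalar input.

    Assumed piecewise definition (verify against the paper):
      sinh(x) / 3   for x > 0
      x**3 / 4      for x <= 0
    """
    return math.sinh(x) / 3 if x > 0 else x ** 3 / 4

# Sample the function around zero to see its shape:
for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    print(f"hyper_sinh({x:+.1f}) = {hyper_sinh(x):+.4f}")
```

In TensorFlow/Keras the same piecewise logic would typically be expressed element-wise, e.g. `tf.where(x > 0, tf.math.sinh(x) / 3, x ** 3 / 4)`, and passed to a layer via its `activation` argument.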



