MorphoActivation: Generalizing ReLU activation function by mathematical morphology

07/13/2022
by Santiago Velasco-Forero, et al.

This paper analyses both nonlinear activation functions and spatial max-pooling for Deep Convolutional Neural Networks (DCNNs) by means of the algebraic basis of mathematical morphology. Additionally, a general family of activation functions is proposed by considering both max-pooling and nonlinear operators in the context of morphological representations. The experimental section validates the effectiveness of our approach on classical benchmarks for supervised learning with DCNNs.
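The abstract does not give the exact operator, but the morphological reading of ReLU that it builds on can be sketched: in the max-plus (tropical) algebra, ReLU(x) = max(x, 0) is a dilation, and a natural generalization takes the pointwise maximum over several learned affine pieces. The Python sketch below is an illustration under that assumption only; morpho_activation and its weights/biases are hypothetical names, not the paper's layer or API.

import numpy as np

def morpho_activation(x, weights, biases):
    """Max-plus (dilation-style) activation: max_j (w_j * x + b_j).

    Illustrative sketch, not the paper's exact formulation. ReLU is
    the special case with pieces (w, b) = (1, 0) and (0, 0), since
    max(x + 0, 0) = max(x, 0).
    """
    # Apply each affine piece to the input, then take the pointwise
    # maximum over pieces (a dilation in max-plus algebra).
    pieces = weights[:, None] * x[None, :] + biases[:, None]
    return pieces.max(axis=0)

# ReLU recovered as a two-piece morphological activation.
x = np.array([-2.0, -0.5, 0.0, 1.5])
relu = morpho_activation(x, weights=np.array([1.0, 0.0]),
                            biases=np.array([0.0, 0.0]))
assert np.allclose(relu, np.maximum(x, 0.0))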


