A Generalization Method of Partitioned Activation Function for Complex Number

02/08/2018
by HyeonSeok Lee, et al.

A method to convert a real-valued partitioned activation function into a complex-valued one is provided. The method has four variations: one has the potential to yield a holomorphic activation, two have the potential to conserve the complex angle, and the last guarantees interaction between the real and imaginary parts. The method has been applied to LReLU and SELU as examples. The complex-valued activation function is a building block of complex-valued ANNs, which have the potential to deal properly with complex-number problems, but such activations are not yet well established. We therefore propose a way to extend partitioned real activations to complex numbers.
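To make the idea concrete, here is a minimal sketch of two common ways a partitioned real activation such as LReLU can be lifted to complex inputs: a split variant that applies the real function to the real and imaginary parts independently, and a modulus/phase variant that acts on the magnitude while conserving the complex angle. The function names, the bias `b`, and the exact forms are illustrative assumptions, not the paper's specific constructions.

```python
import numpy as np

def lrelu(x, alpha=0.01):
    """Standard real-valued Leaky ReLU."""
    return np.where(x >= 0, x, alpha * x)

def split_lrelu(z, alpha=0.01):
    """Split variant (illustrative): apply the real LReLU to the real and
    imaginary parts independently."""
    return lrelu(z.real, alpha) + 1j * lrelu(z.imag, alpha)

def modulus_lrelu(z, b=-0.5, alpha=0.01):
    """Angle-conserving variant (illustrative, modReLU-like): activate the
    shifted modulus |z| + b and keep the original phase. The bias b is needed
    because the modulus itself is always nonnegative."""
    return lrelu(np.abs(z) + b, alpha) * np.exp(1j * np.angle(z))
```

For example, `split_lrelu` maps `-1 + 2j` to `-0.01 + 2j` (the negative real part is leaked, the positive imaginary part passes through), while `modulus_lrelu` shrinks the magnitude of `3 + 4j` from 5 to 4.5 without changing its angle.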


