On Polynomial Approximation of Activation Function

01/29/2022
by John Chiang, et al.

In this work, we propose a method for approximating an activation function over some domain by polynomials of a prescribed low degree. The main idea behind this method can be seen as an extension of the ordinary least-squares method: it incorporates the gradient of the activation function into the cost function being minimized, so that the fitted polynomial matches both the function's values and its slope.
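The idea described in the abstract can be sketched as a small least-squares problem. The snippet below is a minimal illustration, not the paper's actual algorithm: it fits a degree-d polynomial to a target function by stacking a value-residual block and a gradient-residual block (weighted by a hypothetical parameter `lam`) into one linear system and solving it with ordinary least squares.

```python
import numpy as np

def fit_poly_with_gradient(f, df, degree, xs, lam=1.0):
    """Fit a polynomial p of given degree minimizing
    sum_i (p(x_i) - f(x_i))^2 + lam * sum_i (p'(x_i) - f'(x_i))^2.

    This is a sketch of the gradient-augmented least-squares idea;
    `lam` is an assumed weighting parameter, not from the paper.
    Returns coefficients c with p(x) = sum_k c[k] * x**k.
    """
    xs = np.asarray(xs, dtype=float)
    # Value block: V[i, k] = x_i**k
    V = np.vander(xs, degree + 1, increasing=True)
    # Derivative block: D[i, k] = k * x_i**(k-1)
    D = np.zeros_like(V)
    D[:, 1:] = V[:, :-1] * np.arange(1, degree + 1)
    # Stack both residual blocks into one linear least-squares system.
    A = np.vstack([V, np.sqrt(lam) * D])
    b = np.concatenate([f(xs), np.sqrt(lam) * df(xs)])
    coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coeffs

# Example: cubic approximation of the sigmoid on [-4, 4].
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
dsigmoid = lambda x: sigmoid(x) * (1.0 - sigmoid(x))
c = fit_poly_with_gradient(sigmoid, dsigmoid, degree=3,
                           xs=np.linspace(-4.0, 4.0, 101))
```

Including the derivative residuals biases the fit toward matching the shape of the activation, which can matter when the polynomial is later differentiated (e.g., during backpropagation through the approximated activation).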


research 01/15/2023
Empirical study of the modulus as activation function in computer vision applications
In this work we propose a new non-monotonic activation function: the mod...

research 09/05/2020
Binary Classification as a Phase Separation Process
We propose a new binary classification model called Phase Separation Bin...

research 02/08/2018
A Generalization Method of Partitioned Activation Function for Complex Number
A method to convert real number partitioned activation function into com...

research 12/22/2021
Squareplus: A Softplus-Like Algebraic Rectifier
We present squareplus, an activation function that resembles softplus, b...

research 10/28/2021
Conditional Inference and Activation of Knowledge Entities in ACT-R
Activation-based conditional inference applies conditional reasoning to ...

research 06/26/2018
Towards an understanding of CNNs: analysing the recovery of activation pathways via Deep Convolutional Sparse Coding
Deep Convolutional Sparse Coding (D-CSC) is a framework reminiscent of d...

research 11/23/2019
Oscillator Circuit for Spike Neural Network with Sigmoid Like Activation Function and Firing Rate Coding
The study presents an oscillator circuit for a spike neural network with...
