A continuum among logarithmic, linear, and exponential functions, and its potential to improve generalization in neural networks

02/03/2016
by Luke B. Godfrey, et al.

We present the soft exponential activation function for artificial neural networks that continuously interpolates between logarithmic, linear, and exponential functions. This activation function is simple, differentiable, and parameterized so that it can be trained as the rest of the network is trained. We hypothesize that soft exponential has the potential to improve neural network learning, as it can exactly calculate many natural operations that typical neural networks can only approximate, including addition, multiplication, inner product, distance, polynomials, and sinusoids.
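To make the idea concrete, below is a minimal NumPy sketch of the piecewise, trainable-parameter form of soft exponential as given in the paper (α < 0 gives a logarithmic regime, α = 0 the identity, α > 0 an exponential regime). The function name and example values are illustrative, not a reference implementation.

import numpy as np

def soft_exponential(x, alpha):
    """Soft exponential activation: interpolates among log, identity, and exp.

    alpha < 0  -> logarithmic regime
    alpha == 0 -> identity (linear) regime
    alpha > 0  -> exponential regime
    The scalar alpha is intended to be learned along with the network weights.
    """
    if alpha < 0:
        # Logarithmic branch; requires 1 - alpha * (x + alpha) > 0.
        return -np.log(1.0 - alpha * (x + alpha)) / alpha
    if alpha == 0:
        # Linear (identity) branch.
        return x
    # Exponential branch.
    return (np.exp(alpha * x) - 1.0) / alpha + alpha

# Example: the same inputs under the three regimes.
x = np.linspace(-1.0, 1.0, 5)
print(soft_exponential(x, -0.5))  # log-like
print(soft_exponential(x, 0.0))   # identity
print(soft_exponential(x, 0.5))   # exp-like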


Related research:

06/06/2018 · SBAF: A New Activation Function for Artificial Neural Net based Habitability Classification
We explore the efficacy of using a novel activation function in Artifici...

07/23/2020 · Nonclosedness of the Set of Neural Networks in Sobolev Space
We examine the closedness of the set of realized neural networks of a fi...

02/27/2023 · Moderate Adaptive Linear Units (MoLU)
We propose a new high-performance activation function, Moderate Adaptive...

01/01/2019 · Dense Morphological Network: An Universal Function Approximator
Artificial neural networks are built on the basic operation of linear co...

06/20/2018 · Log-sum-exp neural networks and posynomial models for convex and log-log-convex data
We show that a one-layer feedforward neural network with exponential act...

08/28/2017 · A parameterized activation function for learning fuzzy logic operations in deep neural networks
We present a deep learning architecture for learning fuzzy logic express...
