Squareplus: A Softplus-Like Algebraic Rectifier

12/22/2021
by Jonathan T. Barron, et al.

We present squareplus, an activation function that resembles softplus, but which can be computed using only algebraic operations: addition, multiplication, and square-root. Because squareplus is 6x faster to evaluate than softplus on a CPU and does not require access to transcendental functions, it may have practical value in resource-limited deep learning applications.
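To make the "only algebraic operations" claim concrete, here is a minimal sketch of the squareplus activation, following the paper's formulation squareplus(x, b) = (x + sqrt(x^2 + b))/2 (the parameter name `b` and the default value 4 are taken from that formulation; this is an illustration, not the authors' reference implementation):

```python
import math

def squareplus(x: float, b: float = 4.0) -> float:
    """Squareplus activation: (x + sqrt(x^2 + b)) / 2.

    Uses only addition, multiplication, and a square root --
    no transcendental functions (exp/log) as softplus requires.
    With b = 0 it reduces exactly to ReLU, max(x, 0).
    """
    return (x + math.sqrt(x * x + b)) / 2.0
```

Like softplus, the function is smooth and strictly positive for b > 0, approaches the identity for large positive inputs, and approaches zero for large negative inputs; the hyperparameter b controls how sharply it bends near the origin.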


