Leveraging Product as an Activation Function in Deep Networks

10/19/2018
by Luke B. Godfrey, et al.

Product unit neural networks (PUNNs) are powerful representational models with a strong theoretical basis, but they have proven difficult to train with gradient-based optimizers. We present windowed product unit neural networks (WPUNNs), a simple method for leveraging product as a nonlinearity in a neural network. Windowing the product tames the complex gradient surface and enables WPUNNs to learn effectively, solving the problems faced by PUNNs. WPUNNs use product layers between traditional sum layers, capturing the representational power of product units and using the product itself as a nonlinearity. We find that this method works as well as traditional nonlinearities such as ReLU on the MNIST dataset. We also demonstrate that WPUNNs can generalize gated units in recurrent neural networks, yielding results comparable to LSTM networks.
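The abstract gives no implementation details beyond this description, so the following is a minimal NumPy sketch of one plausible reading: an ordinary dense "sum layer" feeding a product layer that multiplies activations within small fixed-size windows. The non-overlapping window layout and the window size of 2 are illustrative assumptions, not details taken from the paper.

import numpy as np

def windowed_product(x, window_size=2):
    # Multiply activations within non-overlapping windows along the
    # feature axis; each output is a product of only `window_size`
    # inputs, which bounds the number of factors in any single product.
    # (Illustrative assumption; the paper's windowing scheme may differ.)
    batch, features = x.shape
    assert features % window_size == 0, "feature count must divide into windows"
    return x.reshape(batch, features // window_size, window_size).prod(axis=-1)

# Usage: a traditional sum (dense) layer followed by the product layer.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))        # batch of 4 examples, 8 features
W = 0.1 * rng.normal(size=(8, 8))  # sum-layer weights (illustrative)
h = x @ W                          # sum layer
y = windowed_product(h)            # product layer, output shape (4, 4)

Keeping each window small limits how many terms are multiplied together, which is presumably what tames the gradient surface relative to a full product unit, where every output multiplies all of its inputs.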


