The SWAG Algorithm: A Mathematical Approach That Outperforms Traditional Deep Learning. Theory and Implementation

11/28/2018
by Saeid Safaei et al.

The performance of artificial neural networks (ANNs) is influenced by weight initialization, the choice of activation functions, and the network architecture. A wide range of activation functions is traditionally used to train neural networks, e.g., sigmoid, tanh, and the Rectified Linear Unit (ReLU), and a widespread practice is to use the same activation function for all neurons in a given layer. In this manuscript, we present a type of neural network in which the activation functions in every layer form a polynomial basis; we name this method SWAG after the initials of the authors' last names. We tested SWAG on three complex, highly non-linear functions as well as the MNIST handwritten digit dataset. SWAG outperforms state-of-the-art fully connected neural networks and converges faster. Given SWAG's low computational complexity and its ability to solve problems that current architectures cannot, it has the potential to change the way we approach deep learning.
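The abstract does not spell out how such a layer is built, so the following is only a minimal sketch of one plausible reading, written in PyTorch: each of K parallel branches applies its own linear map followed by the monomial activation z -> z**k, so that the activations across a layer span the polynomial basis {z, z^2, ..., z^K} of its pre-activations. The class name SWAGLayer, the max_degree parameter, and the layer widths are hypothetical choices for illustration, not taken from the paper.

import torch
import torch.nn as nn

class SWAGLayer(nn.Module):
    # Hypothetical layer: one linear map per monomial degree 1..K,
    # with branch k applying the activation z -> z**k. Concatenating
    # the branches makes the layer's activations a polynomial basis
    # of its pre-activations.
    def __init__(self, in_features, out_features, max_degree=3):
        super().__init__()
        self.linears = nn.ModuleList(
            [nn.Linear(in_features, out_features) for _ in range(max_degree)]
        )

    def forward(self, x):
        return torch.cat(
            [lin(x) ** (k + 1) for k, lin in enumerate(self.linears)],
            dim=-1,
        )

# Usage sketch: a small fully connected network on MNIST-sized inputs
# (784 features, 10 classes); each SWAGLayer triples its nominal width.
model = nn.Sequential(
    SWAGLayer(784, 64),   # output width 64 * 3 = 192
    SWAGLayer(192, 32),   # output width 32 * 3 = 96
    nn.Linear(96, 10),
)
print(model(torch.randn(8, 784)).shape)  # torch.Size([8, 10])

Under this reading, a degree of K = 3 already lets each layer represent cubic functions of its inputs, which would be consistent with fitting highly non-linear targets using a shallow fully connected network.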
