Data-Driven Learning of Feedforward Neural Networks with Different Activation Functions

07/04/2021
by Grzegorz Dudek, et al.

This work contributes to the development of a new data-driven method (D-DM) for learning feedforward neural networks (FNNs). The method, proposed recently as a way of improving randomized learning of FNNs, adjusts the network parameters to the fluctuations of the target function and employs logistic sigmoid activation functions for the hidden nodes. In this study, we introduce other activation functions: the bipolar sigmoid, sine, saturating linear, ReLU, and softplus functions. We derive formulas for their parameters, i.e., the weights and biases. In a simulation study, we evaluate the performance of data-driven FNN learning with the different activation functions. The results indicate that the sigmoid activation functions perform much better than the others when approximating complex, strongly fluctuating target functions.
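As a point of reference, the sketch below (in Python, assuming NumPy; function names are illustrative) lists the standard definitions of the activation functions compared in the study. It shows only the activations themselves; the paper's derived D-DM formulas for the weights and biases are not reproduced here.

import numpy as np

# Standard hidden-node activation functions compared in the paper.

def logistic_sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))        # range (0, 1)

def bipolar_sigmoid(x):
    return 2.0 / (1.0 + np.exp(-x)) - 1.0  # range (-1, 1); equals tanh(x / 2)

def sine(x):
    return np.sin(x)

def satlin(x):
    return np.clip(x, 0.0, 1.0)            # saturating linear, clipped to [0, 1]

def relu(x):
    return np.maximum(0.0, x)

def softplus(x):
    return np.logaddexp(0.0, x)            # log(1 + exp(x)), numerically stable

In a single-hidden-layer FNN, each hidden node applies one of these functions to the weighted input w·x + b; the D-DM derives w and b from the training data so that the nodes follow the local fluctuations of the target function.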

