Improving Randomized Learning of Feedforward Neural Networks by Appropriate Generation of Random Parameters

08/15/2019
by   Grzegorz Dudek, et al.
In this work, a method for generating random parameters in randomized learning of a single-hidden-layer feedforward neural network is proposed. The method first randomly selects the slope angles of the hidden neurons' activation functions from an interval adjusted to the target function, then randomly rotates the activation functions, and finally distributes them across the input space. For complex target functions, the proposed method gives better results than the approach commonly used in practice, in which the random parameters are selected from a fixed interval. This is because it introduces the steepest fragments of the activation functions into the input hypercube and avoids their saturation fragments.
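The three steps described above (sample slope angles, rotate, distribute) can be sketched in NumPy as follows. This is an illustrative sketch, not the authors' exact formulation: the mapping from slope angle to weight magnitude via the tangent, the unit input hypercube, the logistic activation, and all function names are assumptions made for the example.

```python
import numpy as np

def generate_hidden_params(n_hidden, n_inputs, alpha_min, alpha_max, rng=None):
    """Sketch of the described scheme: sample slope angles from an interval
    (assumed to be adjusted to the target function), rotate randomly, and
    distribute the activation functions across the input hypercube."""
    rng = np.random.default_rng() if rng is None else rng
    # 1) Slope angles (radians) -> weight-vector magnitudes:
    #    a steeper angle yields a larger |w| and hence a steeper sigmoid.
    alphas = rng.uniform(alpha_min, alpha_max, n_hidden)
    norms = np.tan(alphas)
    # 2) Random rotation: uniform directions on the unit sphere.
    dirs = rng.standard_normal((n_hidden, n_inputs))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    W = dirs * norms[:, None]
    # 3) Distribution: anchor each sigmoid's steepest (inflection) region at
    #    a random point inside the unit input hypercube by choosing the bias,
    #    so the saturated tails stay outside the region of interest.
    centers = rng.uniform(0.0, 1.0, (n_hidden, n_inputs))
    b = -np.sum(W * centers, axis=1)
    return W, b

def hidden_output(X, W, b):
    # Logistic activations of the hidden layer for inputs X in [0, 1]^d.
    return 1.0 / (1.0 + np.exp(-(X @ W.T + b)))
```

With the hidden parameters fixed this way, the output weights of the network can then be fitted by ordinary least squares on the hidden-layer outputs, as is standard in randomized learning.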

Related research:

- Data-Driven Learning of Feedforward Neural Networks with Different Activation Functions (07/04/2021). This work contributes to the development of a new data-driven method (D-...
- Data-Driven Randomized Learning of Feedforward Neural Networks (08/11/2019). Randomized methods of neural network learning suffer from a problem with...
- Dense Associative Memory for Pattern Recognition (06/03/2016). A model of associative memory is studied, which stores and reliably retr...
- A Constructive Approach for Data-Driven Randomized Learning of Feedforward Neural Networks (09/04/2019). Feedforward neural networks with random hidden nodes suffer from a probl...
- Memorizing Gaussians with no over-parameterization via gradient decent on neural networks (03/28/2020). We prove that a single step of gradient decent over depth two network, w...
- On Interference of Signals and Generalization in Feedforward Neural Networks (10/06/2003). This paper studies how the generalization ability of neurons can be affe...
