Adaptive Extreme Learning Machine for Recurrent Beta-basis Function Neural Network Training

10/31/2018
by   Naima Chouikhi, et al.

The Beta Basis Function Neural Network (BBFNN) is a special kind of kernel basis neural network. It is a feedforward network characterized by the use of the beta function as its hidden activation function. Beta is a flexible transfer function able to represent richer shapes than the common existing activation functions. As with every network, the architecture design and the learning method are the two main challenges faced by BBFNN. In this paper, a new architecture and training algorithm are proposed for the BBFNN. An Extreme Learning Machine (ELM) is used to train the BBFNN with the aim of speeding up the training process. The advantage of ELM is that it reduces computing time and complexity compared with previously used BBFNN learning algorithms such as backpropagation and orthogonal least squares (OLS). On the architectural side, a recurrent structure is added to the standard BBFNN architecture in order to make it better suited to complex, nonlinear and time-varying problems. Throughout this paper, the proposed recurrent ELM-trained BBFNN is tested on a number of time series prediction, classification and regression tasks. Experimental results show noticeable gains of the proposed network over common feedforward and recurrent networks trained by ELM using the hyperbolic tangent activation function. These gains are in terms of accuracy and of robustness against data corruptions such as noise signals.
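To make the training scheme concrete, the following is a minimal sketch (not the authors' implementation) of ELM training with a beta-shaped hidden activation. It assumes the common four-parameter Beta basis form with support (x0, x1) and shape parameters p and q; the weight ranges and hidden layer size are illustrative choices:

```python
import numpy as np

def beta_activation(x, x0=-1.0, x1=1.0, p=2.0, q=2.0):
    # Beta basis function: nonzero only on (x0, x1), with its centre
    # xc fixed by the shape parameters p and q (assumed common form).
    xc = (p * x1 + q * x0) / (p + q)
    out = np.zeros_like(x)
    inside = (x > x0) & (x < x1)
    xi = x[inside]
    out[inside] = ((xi - x0) / (xc - x0)) ** p * ((x1 - xi) / (x1 - xc)) ** q
    return out

def elm_train(X, T, n_hidden=50, seed=None):
    # ELM: input weights and biases are drawn at random and kept fixed;
    # only the output weights are solved in closed form (pseudo-inverse).
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1, 1, size=n_hidden)
    H = beta_activation(X @ W + b)          # hidden-layer output matrix
    beta_out = np.linalg.pinv(H) @ T        # least-squares output weights
    return W, b, beta_out

def elm_predict(X, W, b, beta_out):
    return beta_activation(X @ W + b) @ beta_out
```

Because the hidden weights are never iterated on, the whole fit reduces to one matrix pseudo-inverse, which is the source of the speed-up over backpropagation mentioned in the abstract; the recurrent variant proposed in the paper additionally feeds hidden-state history into this hidden layer.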

Related research:

10/30/2012  Hierarchical Learning Algorithm for the Beta Basis Function Neural Network
  The paper presents a two-level learning method for the design of the Bet...

06/25/2021  Ladder Polynomial Neural Networks
  Polynomial functions have plenty of useful analytical properties, but th...

06/30/2022  Consensus Function from an L_p^q-norm Regularization Term for its Use as Adaptive Activation Functions in Neural Networks
  The design of a neural network is usually carried out by defining the nu...

10/28/2019  Growing axons: greedy learning of neural networks with application to function approximation
  We propose a new method for learning deep neural network models that is ...

05/17/2020  Separation of Memory and Processing in Dual Recurrent Neural Networks
  We explore a neural network architecture that stacks a recurrent layer a...

09/25/2016  The RNN-ELM Classifier
  In this paper we examine learning methods combining the Random Neural Ne...

02/16/2015  Exploring Transfer Function Nonlinearity in Echo State Networks
  Supralinear and sublinear pre-synaptic and dendritic integration is cons...
