Hierarchical Learning Algorithm for the Beta Basis Function Neural Network

10/30/2012
by Habib Dhahri, et al.

The paper presents a two-level learning method for designing the Beta Basis Function Neural Network (BBFNN). A genetic algorithm is employed at the upper level to construct the BBFNN, while the key learning parameters (the widths, the centers, and the Beta form parameters) are optimised with a gradient algorithm at the lower level. To demonstrate the effectiveness of this hierarchical learning algorithm (HLABBFNN), we validate it on the approximation of non-linear functions.
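As a rough illustration of the two-level scheme described in the abstract (not the authors' implementation), the Python sketch below pairs a tiny genetic algorithm at the upper level, which evolves the number of Beta units and perturbs candidate networks, with a lower-level refinement of each unit's center, width and form parameters p and q. The one-dimensional Beta kernel, the finite-difference gradient (used in place of whatever gradient rule the paper derives), and all function names and settings (beta_basis, gradient_refine, hierarchical_learning, population size, learning rate) are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(0)

def beta_basis(x, c, sigma, p, q):
    # Assumed 1-D Beta kernel: ((x-x0)/(c-x0))^p * ((x1-x)/(x1-c))^q on (x0, x1), else 0,
    # with support bounds derived from the center c, width sigma and form parameters p, q.
    x0 = c - sigma * p / (p + q)
    x1 = c + sigma * q / (p + q)
    inside = (x > x0) & (x < x1)
    out = np.zeros_like(x, dtype=float)
    xi = x[inside]
    out[inside] = ((xi - x0) / (c - x0)) ** p * ((x1 - xi) / (x1 - c)) ** q
    return out

def network(x, params):
    # params has one row per Beta unit: (center, width, p, q, output weight).
    return sum(w * beta_basis(x, c, s, p, q) for c, s, p, q, w in params)

def mse(params, x, y):
    return np.mean((network(x, params) - y) ** 2)

def gradient_refine(params, x, y, lr=0.05, steps=100, eps=1e-4):
    # Lower level: forward-difference gradient descent on every unit parameter.
    params = params.copy()
    for _ in range(steps):
        base = mse(params, x, y)
        grad = np.zeros_like(params)
        for idx in np.ndindex(params.shape):
            bumped = params.copy()
            bumped[idx] += eps
            grad[idx] = (mse(bumped, x, y) - base) / eps
        params -= lr * grad
        params[:, 1:4] = np.clip(params[:, 1:4], 0.1, None)  # keep width, p, q positive
    return params

def random_network(m):
    # Random candidate with m Beta units over the input range [-1, 1].
    return np.column_stack([rng.uniform(-1, 1, m),     # centers
                            rng.uniform(0.5, 2.0, m),  # widths
                            rng.uniform(1.0, 3.0, m),  # p
                            rng.uniform(1.0, 3.0, m),  # q
                            rng.normal(0, 1, m)])      # output weights

def hierarchical_learning(x, y, pop_size=6, generations=5):
    # Upper level: a tiny GA over network structures; every candidate is refined
    # by the lower-level gradient step before candidates are ranked by MSE.
    population = [random_network(rng.integers(2, 6)) for _ in range(pop_size)]
    for _ in range(generations):
        population = [gradient_refine(p, x, y) for p in population]
        population.sort(key=lambda p: mse(p, x, y))
        survivors = population[: pop_size // 2]
        children = []
        for parent in survivors:
            child = parent.copy()
            child += rng.normal(0, 0.05, child.shape)   # mutate parameters
            if rng.random() < 0.3:                      # occasionally grow the structure
                child = np.vstack([child, random_network(1)])
            children.append(child)
        population = survivors + children
    return min(population, key=lambda p: mse(p, x, y))

# Example: approximate a non-linear target, mirroring the paper's validation setting.
x = np.linspace(-1, 1, 100)
y = np.sin(3 * x) * np.exp(-x ** 2)
best = hierarchical_learning(x, y)
print("final MSE:", mse(best, x, y))

In this sketch a candidate's fitness is simply its mean-squared error after lower-level refinement, so structure (upper level) and parameters (lower level) are judged together, which is the core idea of the hierarchical scheme.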

Related research

Adaptive Extreme Learning Machine for Recurrent Beta-basis Function Neural Network Training (10/31/2018)
Beta Basis Function Neural Network (BBFNN) is a special kind of kernel b...

On ordered beta distribution and the generalized incomplete beta function (05/01/2023)
Motivated by applications in Bayesian analysis we introduce a multidimen...

Growing axons: greedy learning of neural networks with application to function approximation (10/28/2019)
We propose a new method for learning deep neural network models that is ...

Beta Shapley: a Unified and Noise-reduced Data Valuation Framework for Machine Learning (10/26/2021)
Data Shapley has recently been proposed as a principled framework to qua...

Generalized Beta Divergence (06/14/2013)
This paper generalizes beta divergence beyond its classical form associa...

SSS* = Alpha-Beta + TT (04/05/2014)
In 1979 Stockman introduced the SSS* minimax search algorithm that domi-...

Estimating Random Delays in Modbus Network Using Experiments and General Linear Regression Neural Networks with Genetic Algorithm Smoothing (09/21/2015)
Time-varying delays adversely affect the performance of networked contro...
