Global convergence of Negative Correlation Extreme Learning Machine

09/30/2020
by Carlos Perales-González, et al.

Ensemble approaches in the Extreme Learning Machine (ELM) literature mainly come from methods that rely on data sampling procedures, under the assumption that the training data are heterogeneous enough to build diverse base learners. To overcome this assumption, an ELM ensemble method based on the Negative Correlation Learning (NCL) framework, called Negative Correlation Extreme Learning Machine (NCELM), was proposed. This model works in three stages: i) different ELMs are generated as base learners with random weights in the hidden layer; ii) an NCL penalty term carrying the ensemble prediction is introduced into each ELM minimization problem, updating the base learners; iii) the second stage is iterated until the ensemble converges. Although this NCL ensemble method was validated by an experimental study with multiple benchmark datasets, no conditions guaranteeing this convergence were given. This paper mathematically presents sufficient conditions for the global convergence of NCELM. The update of the ensemble in each iteration is shown to be a contraction mapping, and global convergence of the ensemble is then proved via the Banach fixed-point theorem.
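
For intuition, the three-stage scheme described above can be sketched as a short numerical experiment. The sketch below is a hypothetical illustration, not the authors' implementation: the linear correlation penalty `lam * f_bar` is a simplified stand-in for the paper's exact NCL term, and the names `ncelm_sketch`, `hidden`, and all hyperparameters are assumptions made here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def hidden(X, W, b):
    # Sigmoid random-feature map of a single ELM (weights stay fixed).
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def ncelm_sketch(X, Y, L=50, S=5, C=1.0, lam=0.5, max_iter=50, tol=1e-6):
    n, d = X.shape
    # Stage i): S base ELMs with random hidden weights, each solved
    # independently as a ridge-regularized least-squares problem.
    Ws = [rng.standard_normal((d, L)) for _ in range(S)]
    bs = [rng.standard_normal(L) for _ in range(S)]
    Hs = [hidden(X, W, b) for W, b in zip(Ws, bs)]
    betas = [np.linalg.solve(H.T @ H + C * np.eye(L), H.T @ Y) for H in Hs]
    f_bar = sum(H @ b for H, b in zip(Hs, betas)) / S  # ensemble prediction

    # Stages ii)-iii): re-solve each learner with a simplified NCL-style
    # penalty lam * <H beta, f_bar> discouraging correlation with the
    # ensemble, then recompute the ensemble; iterate to a fixed point.
    for _ in range(max_iter):
        for s, H in enumerate(Hs):
            betas[s] = np.linalg.solve(
                H.T @ H + C * np.eye(L),
                H.T @ (Y - 0.5 * lam * f_bar),
            )
        f_new = sum(H @ b for H, b in zip(Hs, betas)) / S
        if np.linalg.norm(f_new - f_bar) <= tol * max(1.0, np.linalg.norm(f_bar)):
            break  # ensemble output stopped moving: fixed point reached
        f_bar = f_new
    return Ws, bs, betas
```

For small enough `lam`, the map from one ensemble prediction to the next in this sketch is affine with a small slope, which is exactly the situation where a fixed-point argument applies.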
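
The convergence argument itself rests on the standard Banach fixed-point theorem; a hedged statement of the condition, with $T$ denoting the per-iteration ensemble update and $q$ its contraction constant, is:

```latex
\[
  \exists\, q \in [0,1):\quad
  \lVert T(\beta) - T(\beta') \rVert \le q\, \lVert \beta - \beta' \rVert
  \;\;\Longrightarrow\;\;
  \beta^{(k+1)} = T\bigl(\beta^{(k)}\bigr) \to \beta^{*},
  \quad T(\beta^{*}) = \beta^{*}.
\]
```

When this holds, the iterates converge to a unique fixed point from any initialization, which is what "global convergence" means here; the paper's contribution is identifying sufficient conditions under which the NCELM update is such a contraction.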
