Using CODEQ to Train Feed-forward Neural Networks

02/03/2010
by Mahamed G. H. Omran, et al.

CODEQ is a new population-based meta-heuristic that hybridizes concepts from chaotic search, opposition-based learning, differential evolution, and quantum mechanics. It has been applied successfully to several classes of problems (e.g., constrained, integer-programming, and engineering optimization) with excellent results. In this paper, CODEQ is used to train feed-forward neural networks. The proposed method is compared with particle swarm optimization and differential evolution on three data sets, with encouraging results.
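
To make the training setup concrete, the sketch below treats the network's flattened weight vector as an individual in a population-based optimizer and minimizes the training error directly, which is the general scheme the abstract describes for CODEQ, PSO, and DE. It is not the authors' CODEQ implementation: it uses a plain DE/rand/1/bin update together with an opposition-based initialization step, and the network size, XOR toy data, weight bounds, and control parameters (pop_size, F, CR) are all illustrative assumptions.

```python
# Hedged sketch: training a one-hidden-layer feed-forward network by optimizing
# its flattened weight vector with a population-based metaheuristic.
# This is NOT the paper's CODEQ; it combines a plain DE/rand/1/bin update with
# opposition-based initialization purely to illustrate the setup described above.
import numpy as np

rng = np.random.default_rng(0)

# Toy data set (XOR); the paper uses three benchmark data sets not reproduced here.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

n_in, n_hid, n_out = 2, 4, 1
dim = n_in * n_hid + n_hid + n_hid * n_out + n_out  # total weights + biases

def unpack(w):
    """Split a flat weight vector into layer matrices and bias vectors."""
    i = 0
    W1 = w[i:i + n_in * n_hid].reshape(n_in, n_hid);   i += n_in * n_hid
    b1 = w[i:i + n_hid];                                i += n_hid
    W2 = w[i:i + n_hid * n_out].reshape(n_hid, n_out);  i += n_hid * n_out
    b2 = w[i:i + n_out]
    return W1, b1, W2, b2

def mse(w):
    """Fitness: mean squared training error of the network defined by w."""
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output layer
    return float(np.mean((out - y) ** 2))

# Opposition-based initialization: evaluate each random individual and its
# "opposite" point (lo + hi - x), and keep whichever is fitter.
pop_size, lo, hi = 30, -1.0, 1.0
cand = rng.uniform(lo, hi, (pop_size, dim))
opp = lo + hi - cand
keep_cand = np.array([mse(c) for c in cand]) <= np.array([mse(o) for o in opp])
pop = np.where(keep_cand[:, None], cand, opp)
fit = np.array([mse(p) for p in pop])

F, CR = 0.5, 0.9  # assumed DE control parameters
for gen in range(500):
    for i in range(pop_size):
        a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)
        cross = rng.random(dim) < CR
        cross[rng.integers(dim)] = True          # guarantee at least one gene crosses
        trial = np.where(cross, mutant, pop[i])
        f_trial = mse(trial)
        if f_trial <= fit[i]:                    # greedy selection
            pop[i], fit[i] = trial, f_trial

print("best training MSE:", fit.min())
```

Flattening the weights into a single vector is what allows any population-based algorithm, CODEQ included, to stand in for back-propagation: the optimizer only ever sees a fitness function mapping a candidate weight vector to a training error.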

Related research

06/23/2012
Analysis of a Nature Inspired Firefly Algorithm based Back-propagation Neural Network Training
Optimization algorithms are normally influenced by meta-heuristic approa...

02/27/2015
Norm-Based Capacity Control in Neural Networks
We investigate the capacity, convexity and characterization of a general...

04/27/2015
Optimal Convergence Rate in Feed Forward Neural Networks using HJB Equation
A control theoretic approach is presented in this paper for both batch a...

09/12/2012
Training a Feed-forward Neural Network with Artificial Bee Colony Based Backpropagation Method
Back-propagation algorithm is one of the most widely used and popular te...

06/10/2022
An application of neural networks to a problem in knot theory and group theory (untangling braids)
We report on our success on solving the problem of untangling braids up ...

02/21/2023
On the Behaviour of Pulsed Qubits and their Application to Feed Forward Networks
In the last two decades, the combination of machine learning and quantum...

02/19/2002
On model selection and the disability of neural networks to decompose tasks
A neural network with fixed topology can be regarded as a parametrizatio...
