Self Organizing Classifiers: First Steps in Structured Evolutionary Machine Learning

Learning classifier systems (LCSs) are evolutionary machine learning algorithms flexible enough to be applied to reinforcement, supervised, and unsupervised learning problems with good performance. Recently, self-organizing classifiers were proposed; they are similar to LCSs but have the advantage that, within their structured population, no balance between niching and fitness pressure is necessary. However, more tests and analysis are required to verify their benefits. Here, a variation of the original algorithm is proposed which uses a parameterless self-organizing map (SOM). This algorithm is applied with good performance to challenging problems such as large, noisy, and dynamically changing continuous input-action mazes (including growing and compressing mazes). Moreover, a genetic operator is proposed that exploits the topological information of the SOM's population structure, further improving the results. Thus, the first steps in structured evolutionary machine learning are demonstrated, and even so the problems tackled are more difficult than the state-of-the-art continuous input-action multi-step ones.
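
To make the idea of a SOM-structured population and a topology-aware genetic operator concrete, the following is a minimal illustrative sketch, not the authors' implementation: it uses a fixed learning rate and neighbourhood radius rather than the parameterless SOM mentioned above, and every name and parameter (GRID_W, topological_crossover, the linear classifiers, etc.) is an assumption made for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a 2D grid of SOM nodes, each holding one linear
# classifier (weight vector) over a continuous input space.
GRID_W, GRID_H, INPUT_DIM = 4, 4, 3
weights = rng.normal(size=(GRID_W, GRID_H, INPUT_DIM))     # classifier weights
prototypes = rng.normal(size=(GRID_W, GRID_H, INPUT_DIM))  # SOM prototypes (niche centres)

def best_matching_node(x):
    # Grid coordinates of the node whose prototype is closest to input x.
    dists = np.linalg.norm(prototypes - x, axis=2)
    return np.unravel_index(np.argmin(dists), dists.shape)

def som_update(x, lr=0.1, radius=1.0):
    # Move the winner's prototype, and those of its grid neighbours,
    # toward the input (standard SOM step; the paper's variant is parameterless).
    wi, wj = best_matching_node(x)
    for i in range(GRID_W):
        for j in range(GRID_H):
            grid_dist2 = (i - wi) ** 2 + (j - wj) ** 2
            h = np.exp(-grid_dist2 / (2 * radius ** 2))  # neighbourhood kernel
            prototypes[i, j] += lr * h * (x - prototypes[i, j])

def topological_crossover(i, j):
    # Genetic operator exploiting grid topology: uniform crossover between a
    # node's classifier and that of a randomly chosen grid neighbour,
    # followed by a small mutation.
    neighbours = [(i + di, j + dj)
                  for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                  if 0 <= i + di < GRID_W and 0 <= j + dj < GRID_H]
    ni, nj = neighbours[rng.integers(len(neighbours))]
    mask = rng.random(INPUT_DIM) < 0.5
    child = np.where(mask, weights[i, j], weights[ni, nj])
    return child + rng.normal(scale=0.05, size=INPUT_DIM)

# Example usage: adapt the map to one input and breed a child at the winner node.
x = rng.normal(size=INPUT_DIM)
som_update(x)
wi, wj = best_matching_node(x)
weights[wi, wj] = topological_crossover(wi, wj)

The point of the operator is that mating partners are drawn from topological neighbours, so recombination happens between classifiers that the SOM has already placed in adjacent niches.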
