Modular Grammatical Evolution for the Generation of Artificial Neural Networks

08/04/2022
by   Khabat Soltanian, et al.

This paper presents a novel method, called Modular Grammatical Evolution (MGE), towards validating the hypothesis that restricting the solution space of NeuroEvolution to modular and simple neural networks enables the efficient generation of smaller and more structured neural networks while providing acceptable (and in some cases superior) accuracy on large data sets. MGE also advances the state-of-the-art Grammatical Evolution (GE) methods in two directions. First, MGE's representation is modular in that each individual has a set of genes, and each gene is mapped to a neuron by grammatical rules. Second, the proposed representation mitigates two important drawbacks of GE, namely low scalability and weak locality of representation, towards generating modular and multi-layer networks with a high number of neurons. We define and evaluate five different forms of structures with and without modularity using MGE and find single-layer modules with no coupling to be the most productive. Our experiments demonstrate that modularity helps in finding better neural networks faster. We have validated the proposed method using ten well-known classification benchmarks with different sizes, feature counts, and output class counts. Our experimental results indicate that MGE provides superior accuracy with respect to existing NeuroEvolution methods and returns classifiers that are significantly simpler than those generated by other machine learning methods. Finally, we empirically demonstrate that MGE outperforms other GE methods in terms of locality and scalability.
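The core of the representation described above is the standard GE genotype-to-phenotype mapping, applied per gene so that each gene decodes to one neuron. The sketch below illustrates that mapping with a toy grammar; the grammar, the `sig(...)` notation, and the example gene values are illustrative assumptions for this sketch, not the paper's actual grammar.

```python
# Toy BNF grammar: each nonterminal maps to a list of productions.
# (Hypothetical grammar for illustration; MGE's actual grammar differs.)
GRAMMAR = {
    "<neuron>": [["sig(", "<sum>", ")"]],
    "<sum>":    [["<term>"], ["<term>", "+", "<sum>"]],
    "<term>":   [["<w>", "*x[", "<i>", "]"]],
    "<w>":      [["0.5"], ["-0.5"], ["1.0"]],
    "<i>":      [["0"], ["1"], ["2"]],
}

def map_gene(gene, symbol="<neuron>", max_wraps=2):
    """Standard GE genotype-to-phenotype mapping: each codon selects a
    production via modulo over the rule count; the codon list is reused
    (wrapped) up to max_wraps times if it is exhausted."""
    codons = list(gene) * (max_wraps + 1)
    pos = 0

    def expand(sym):
        nonlocal pos
        if sym not in GRAMMAR:          # terminal symbol: emit as-is
            return sym
        rules = GRAMMAR[sym]
        choice = rules[codons[pos] % len(rules)]
        pos += 1
        return "".join(expand(s) for s in choice)

    return expand(symbol)

# In a modular representation, an individual is a set of genes and each
# gene decodes independently into one neuron, so editing one gene does
# not disturb the neurons decoded from the others (improved locality).
individual = [[0, 2, 1], [3, 1, 0, 2, 0]]
neurons = [map_gene(g) for g in individual]
print(neurons)  # e.g. ['sig(0.5*x[2])', 'sig(1.0*x[0]+0.5*x[2]+-0.5*x[0])']
```

Because each neuron has its own gene, a mutation in one gene leaves every other neuron's phenotype unchanged, which is one plausible reading of how a modular encoding mitigates GE's weak locality.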

