On the Exploitation of Neuroevolutionary Information: Analyzing the Past for a More Efficient Future

by Unai Garciarena, et al.

Neuroevolutionary algorithms, which automatically search for neural network structures by means of evolutionary techniques, are computationally costly procedures. Even so, they are widely applied because of the strong performance of the architectures they find. The final outcome of a neuroevolutionary process is the best structure found during the search; the rest of the procedure is commonly discarded in the literature. However, these searches also produce a considerable amount of residual information from which valuable knowledge can be extracted. In this paper, we propose an approach that extracts this information from neuroevolutionary runs and uses it to build a metamodel that can positively impact future neural architecture searches. More specifically, by inspecting the best structures found during neuroevolutionary searches of generative adversarial networks with varying characteristics (e.g., based on dense or convolutional layers), we propose a Bayesian network-based model that can be used to find strong neural structures right away, to conveniently initialize structural searches for different problems, or to help future optimization of structures of any type keep finding increasingly better structures where uninformed methods get stuck in local optima.
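The general idea of reusing the "residual" outcomes of past searches can be illustrated with a minimal sketch. The code below is a hypothetical toy example, not the paper's implementation: it models the layer-type sequences of the best architectures from previous runs with a simple chain-structured probabilistic model (each layer type conditioned on the previous one, a special case of a Bayesian network), and then samples fresh architectures from it to seed a new search. The layer vocabulary and helper names are invented for illustration.

```python
import random
from collections import defaultdict

START, END = "<start>", "<end>"


def fit_transition_model(architectures):
    """Estimate P(layer_t | layer_{t-1}) from sequences of layer types.

    `architectures` is a list of layer-type sequences, e.g. the best
    structure found in each past neuroevolutionary run.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for arch in architectures:
        seq = [START] + list(arch) + [END]
        for prev, cur in zip(seq, seq[1:]):
            counts[prev][cur] += 1
    # Normalize counts into conditional probability tables.
    model = {}
    for prev, nxt in counts.items():
        total = sum(nxt.values())
        model[prev] = {layer: c / total for layer, c in nxt.items()}
    return model


def sample_architecture(model, max_len=10, rng=random):
    """Draw a new layer sequence from the fitted model."""
    arch, state = [], START
    while len(arch) < max_len:
        layers, probs = zip(*model[state].items())
        state = rng.choices(layers, weights=probs)[0]
        if state == END:
            break
        arch.append(state)
    return arch


# Toy "residual information": best structures from three past runs.
past_best = [
    ["conv", "conv", "pool", "dense"],
    ["conv", "pool", "conv", "dense"],
    ["dense", "dense"],
]
model = fit_transition_model(past_best)
print(sample_architecture(model))
```

Sampled sequences could then be used as the initial population of a new evolutionary search instead of random structures, which is the kind of informed initialization the abstract describes.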


