A Specialized Evolutionary Strategy Using Mean Absolute Error Random Sampling to Design Recurrent Neural Networks

09/04/2019
by   Andrés Camero, et al.

Recurrent neural networks have proven effective at solving prediction problems. However, finding a network that suits a given problem is hard because of their high sensitivity to the hyperparameter configuration. Automatic hyperparameter optimization methods help find the most suitable configuration, but they are not widely adopted because of their high computational cost. In this work, we study the use of mean absolute error random sampling to compare multiple-hidden-layer architectures, and we propose an evolutionary strategy-based algorithm that uses its results to optimize the configuration of a recurrent network. We empirically validate our proposal and show that it is possible to predict and compare the expected performance of a hyperparameter configuration at low cost, and to use these predictions to optimize the configuration of a recurrent network.
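The core idea of MAE random sampling can be sketched as follows: instead of training each candidate architecture, sample many random weight initializations, record the mean absolute error each untrained network achieves, and compare architectures by the resulting error distributions. The snippet below is a minimal, hypothetical illustration with a single-hidden-layer Elman RNN in NumPy; the network structure, weight scale, and sample count are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def rnn_forward(x, Wx, Wh, Wo, bh, bo):
    # x: (T, input_dim). Simple Elman RNN: tanh hidden state, linear output.
    h = np.zeros(Wh.shape[0])
    for t in range(x.shape[0]):
        h = np.tanh(x[t] @ Wx + h @ Wh + bh)
    return (h @ Wo + bo)[0]

def mae_random_sampling(x, y, hidden, n_samples=50, seed=0):
    """Sample random weight sets for an untrained RNN with `hidden` units
    and record the MAE of each sample (illustrative sketch, not the
    authors' implementation)."""
    rng = np.random.default_rng(seed)
    in_dim = x.shape[-1]
    maes = []
    for _ in range(n_samples):
        # Standard-normal weights are an assumption for this sketch.
        Wx = rng.normal(0.0, 1.0, (in_dim, hidden))
        Wh = rng.normal(0.0, 1.0, (hidden, hidden))
        Wo = rng.normal(0.0, 1.0, (hidden, 1))
        bh = rng.normal(0.0, 1.0, hidden)
        bo = rng.normal(0.0, 1.0, 1)
        preds = np.array([rnn_forward(seq, Wx, Wh, Wo, bh, bo) for seq in x])
        maes.append(np.mean(np.abs(preds - y)))
    return np.array(maes)

# Toy comparison: architectures whose MAE distribution concentrates on
# lower values are predicted to perform better once actually trained.
x = np.sin(np.linspace(0, 8, 200)).reshape(20, 10, 1)  # 20 sequences
y = x[:, -1, 0]  # predict the last value of each sequence
small = mae_random_sampling(x, y, hidden=4)
large = mae_random_sampling(x, y, hidden=32)
print(np.quantile(small, 0.1), np.quantile(large, 0.1))
```

An optimizer such as the proposed evolutionary strategy can then use a statistic of these samples (for example, a low quantile of the MAE distribution) as a cheap fitness surrogate, avoiding full training of every candidate.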

Related research

- Low-Cost Recurrent Neural Network Expected Performance Evaluation (05/18/2018): Recurrent neural networks are strong dynamic systems, but they are very ...
- Reliable and Fast Recurrent Neural Network Architecture Optimization (06/29/2021): This article introduces Random Error Sampling-based Neuroevolution (RESN...
- Bayesian Neural Architecture Search using A Training-Free Performance Metric (01/29/2020): Recurrent neural networks (RNNs) are a powerful approach for time series...
- Online hyperparameter optimization by real-time recurrent learning (02/15/2021): Conventional hyperparameter optimization methods are computationally int...
- Automated Configuration of Genetic Algorithms by Tuning for Anytime Performance (06/11/2021): Finding the best configuration of algorithms' hyperparameters for a give...
- Auto-Model: Utilizing Research Papers and HPO Techniques to Deal with the CASH problem (10/24/2019): In many fields, a mass of algorithms with completely different hyperpara...
