A Scalable Continuous Unbounded Optimisation Benchmark Suite from Neural Network Regression

09/12/2021
by   Katherine M. Malan, et al.

For the design of optimisation algorithms that perform well in general, it is necessary to experiment with and benchmark on a range of problems with diverse characteristics. The training of neural networks is an optimisation task that has gained prominence with the recent successes of deep learning. Although evolutionary algorithms have been used for training neural networks, gradient descent variants remain by far the most common choice, owing to their reliable performance on large-scale machine learning tasks. With this paper we contribute CORNN (Continuous Optimisation of Regression tasks using Neural Networks), a large suite that can easily be used to benchmark the performance of any continuous black-box algorithm on neural network training problems. By employing different base regression functions and neural network architectures, problem instances with different dimensions and levels of difficulty can be created. We demonstrate the use of the CORNN suite by comparing the performance of three evolutionary and swarm-based algorithms on a set of over 300 problem instances. With the exception of random search, we provide evidence of performance complementarity between the algorithms. As a baseline, results are also provided to contrast the performance of the best population-based algorithm against a gradient-based approach (Adam). The suite is shared as a public web repository to facilitate easy integration with existing benchmarking platforms.
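The construction described above (a fixed base regression function, a fixed network architecture, and the network's weights as the continuous search space) can be sketched as follows. This is a minimal illustration of the general idea, not CORNN's actual implementation: the base function, architecture, and all names here are assumptions for the example.

```python
import numpy as np

def base_function(x):
    # Hypothetical 1-D base regression target (stand-in for a CORNN base function)
    return np.sin(3 * x) + 0.5 * x

# Fixed training data sampled from the base function
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = base_function(X)

# A small fixed architecture: 1 -> H -> 1 with tanh hidden activations.
# The search dimension is the total number of weights and biases.
H = 10
DIM = 1 * H + H + H * 1 + 1  # W1 + b1 + W2 + b2 = 31

def unpack(w):
    """Split a flat weight vector into layer matrices and biases."""
    i = 0
    W1 = w[i:i + H].reshape(1, H); i += H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H].reshape(H, 1); i += H
    b2 = w[i:i + 1]
    return W1, b1, W2, b2

def objective(w):
    """Mean squared error of the network on the regression task --
    the black-box fitness any continuous optimiser would minimise."""
    W1, b1, W2, b2 = unpack(w)
    hidden = np.tanh(X @ W1 + b1)
    pred = hidden @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# Any continuous unbounded optimiser can now be run on `objective` over R^DIM;
# plain random search serves as a trivial baseline:
best = min(objective(rng.normal(size=DIM)) for _ in range(100))
```

Varying the base function and the architecture (depth, width, activations) changes the dimensionality and difficulty of the instance, which is how a suite of diverse problems can be generated from this single template.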


