Neural Networks Designing Neural Networks: Multi-Objective Hyper-Parameter Optimization

11/07/2016
by Sean C. Smithson, et al.

Artificial neural networks have gone through a recent rise in popularity, achieving state-of-the-art results in various fields, including image classification, speech recognition, and automated control. Both the performance and computational complexity of such models are heavily dependent on the design of characteristic hyper-parameters (e.g., number of hidden layers, nodes per layer, or choice of activation functions), which have traditionally been optimized manually. With machine learning penetrating low-power mobile and embedded areas, the need to optimize not only for performance (accuracy), but also for implementation complexity, becomes paramount. In this work, we present a multi-objective design space exploration method that reduces the number of solution networks trained and evaluated through response surface modelling. Given spaces which can easily exceed 10^20 solutions, manually designing a near-optimal architecture is unlikely, as opportunities to reduce network complexity, while maintaining performance, may be overlooked. This problem is exacerbated by the fact that hyper-parameters which perform well on specific datasets may yield sub-par results on others, and must therefore be designed on a per-application basis. In our work, machine learning is leveraged by training an artificial neural network to predict the performance of future candidate networks. The method is evaluated on the MNIST and CIFAR-10 image datasets, optimizing for both recognition accuracy and computational complexity. Experimental results demonstrate that the proposed method can closely approximate the Pareto-optimal front, while only exploring a small fraction of the design space.
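The loop the abstract describes can be sketched as follows: fit a cheap surrogate ("response surface") on networks already trained, use it to predict the two objectives for a large pool of untrained candidates, and keep only the predicted Pareto-optimal candidates for full training. Everything in the sketch below is an illustrative assumption, not the authors' implementation: the linear least-squares surrogate stands in for the paper's predictor network, and the three-dimensional hyper-parameter encoding and random stand-in measurements are hypothetical.

```python
# Minimal sketch of surrogate-assisted multi-objective hyper-parameter search.
# Objectives are (error, complexity), both to be minimised.
import numpy as np

rng = np.random.default_rng(0)

def pareto_front(objs):
    """Boolean mask of non-dominated rows, minimising every column."""
    n = len(objs)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        # Row i is dominated if some row is <= in all objectives
        # and strictly < in at least one.
        dominates_i = np.all(objs <= objs[i], axis=1) & np.any(objs < objs[i], axis=1)
        keep[i] = not dominates_i.any()
    return keep

# Hypothetical design space: (depth, width, activation id), normalised to [0, 1].
evaluated = rng.random((30, 3))   # configurations already trained
true_objs = rng.random((30, 2))   # measured (error, complexity), stand-in values

# Fit a simple linear surrogate per objective via least squares; this is a
# placeholder for the paper's neural-network predictor.
X = np.hstack([evaluated, np.ones((len(evaluated), 1))])  # add bias column
coef, *_ = np.linalg.lstsq(X, true_objs, rcond=None)

# Score a large pool of untrained candidates cheaply with the surrogate.
candidates = rng.random((1000, 3))
pred = np.hstack([candidates, np.ones((len(candidates), 1))]) @ coef

# Only the predicted-Pareto-optimal candidates proceed to full training.
promising = candidates[pareto_front(pred)]
print(f"{len(promising)} of {len(candidates)} candidates selected for training")
```

Swapping the least-squares fit for a small regression network recovers the setup the abstract actually names, an artificial neural network predicting the performance of future candidate networks.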

Related research

05/25/2022
Concurrent Neural Tree and Data Preprocessing AutoML for Image Classification
Deep Neural Networks (DNNs) are a widely-used solution for a variety of...

11/28/2018
The SWAG Algorithm; a Mathematical Approach that Outperforms Traditional Deep Learning. Theory and Implementation
The performance of artificial neural networks (ANNs) is influenced by we...

10/10/2018
Automatic Configuration of Deep Neural Networks with EGO
Designing the architecture for an artificial neural network is a cumbers...

02/02/2021
Fast Exploration of Weight Sharing Opportunities for CNN Compression
The computational workload involved in Convolutional Neural Networks (CN...

01/04/2018
DENSER: Deep Evolutionary Network Structured Representation
Deep Evolutionary Network Structured Representation (DENSER) is a novel ...

04/08/2020
GeneCAI: Genetic Evolution for Acquiring Compact AI
In the contemporary big data realm, Deep Neural Networks (DNNs) are evol...

06/28/2019
Mise en abyme with artificial intelligence: how to predict the accuracy of NN, applied to hyper-parameter tuning
In the context of deep learning, the costliest phase from a computationa...
