Optimizing Deep Neural Network Architecture: A Tabu Search Based Approach

08/17/2018
by Tarun Kumar Gupta, et al.

The performance of a feedforward neural network (FNN) depends entirely on the choice of architecture and training algorithm. An FNN architecture can be tuned through several parameters, such as the number of hidden layers, the number of neurons in each hidden layer, and the number of connections between layers. The possible combinations of these architectural attributes grow exponentially and are unmanageable by hand, so an algorithm is needed that can automatically design an optimal architecture with high generalization ability. Numerous optimization algorithms have been applied to FNN architecture determination. This paper proposes a new methodology for estimating the number of hidden layers and their respective neurons in an FNN. The work combines the strengths of Tabu search (TS) and the gradient descent with momentum backpropagation (GDM) training algorithm, demonstrating how Tabu search can automatically select the best architecture from a population of candidate architectures using minimum testing error as the criterion. The proposed approach has been tested on four classification benchmark datasets of different sizes.
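The paper itself does not publish code; the sketch below is only an illustration of the general idea described in the abstract. It runs a Tabu search over (hidden-layer, neuron-count) configurations and scores each candidate by the testing error of an FNN trained with gradient descent plus momentum. Scikit-learn's MLPClassifier with an SGD solver is assumed as a stand-in for GDM, the breast-cancer dataset is a placeholder benchmark, and the neighbourhood rules, tabu tenure, and hyperparameters are illustrative assumptions rather than the authors' settings.

```python
# Minimal sketch (not the authors' code): Tabu search over FNN architectures,
# each scored by testing error after training with SGD + momentum (~ GDM).
import random
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Placeholder benchmark dataset; the paper uses four classification benchmarks.
X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def test_error(arch):
    """Train an FNN with the given hidden-layer sizes and return its test error."""
    clf = MLPClassifier(hidden_layer_sizes=arch, solver="sgd", momentum=0.9,
                        nesterovs_momentum=False, learning_rate_init=0.01,
                        max_iter=300, random_state=0)
    clf.fit(X_tr, y_tr)
    return 1.0 - clf.score(X_te, y_te)

def neighbours(arch, step=4, max_layers=3, max_neurons=64):
    """Architectures reachable by resizing a layer, adding a layer, or dropping one."""
    out = []
    for i, n in enumerate(arch):
        if n + step <= max_neurons:
            out.append(arch[:i] + (n + step,) + arch[i + 1:])
        if n - step >= 2:
            out.append(arch[:i] + (n - step,) + arch[i + 1:])
    if len(arch) < max_layers:
        out.append(arch + (step,))   # add a new hidden layer
    if len(arch) > 1:
        out.append(arch[:-1])        # drop the last hidden layer
    return out

def tabu_search(start=(8,), iters=20, tabu_size=5):
    """Return the best architecture found and its testing error."""
    current = best = start
    best_err = test_error(best)
    tabu = [start]                                   # short-term memory
    for _ in range(iters):
        candidates = [a for a in neighbours(current) if a not in tabu]
        if not candidates:
            break
        scored = [(test_error(a), a) for a in candidates]
        err, current = min(scored)                   # best non-tabu move, even if worse
        tabu.append(current)
        tabu = tabu[-tabu_size:]
        if err < best_err:                           # keep the global best seen so far
            best, best_err = current, err
    return best, best_err

if __name__ == "__main__":
    arch, err = tabu_search()
    print(f"best architecture {arch} with testing error {err:.3f}")
```

The key design choice mirrored here is that the search always moves to the best non-tabu neighbour, even when it is worse than the current solution, which lets Tabu search escape local minima that a greedy architecture search would get stuck in.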


