
Autonomously and Simultaneously Refining Deep Neural Network Parameters by a Bi-Generative Adversarial Network Aided Genetic Algorithm

by Yantao Lu, et al.
Syracuse University

The choice of parameters and the design of the network architecture are important factors affecting the performance of deep neural networks. Genetic Algorithms (GA) have been used before to determine parameters of a network. Yet, GAs perform a finite search over a discrete set of pre-defined candidates, and cannot, in general, generate unseen configurations. In this paper, to move from exploration to exploitation, we propose a novel and systematic method that autonomously and simultaneously optimizes multiple parameters of any deep neural network by using a GA aided by a bi-generative adversarial network (Bi-GAN). The proposed Bi-GAN allows the autonomous exploitation and choice of the number of neurons for fully-connected layers, and the number of filters for convolutional layers, from a large range of values. Our proposed Bi-GAN involves two generators; the two models compete with and improve each other progressively, via a GAN-based strategy, to optimize the networks during GA evolution. The proposed approach can be used to autonomously refine the number of convolutional and dense layers, the number and size of kernels, and the number of neurons in the dense layers; choose the type of the activation function; and decide whether to use dropout and batch normalization, to improve the accuracy of different deep neural network architectures. Without loss of generality, the proposed method has been tested on the ModelNet database, and compared with 3D ShapeNets and two GA-only methods. The results show that the presented approach can simultaneously and successfully optimize multiple neural network parameters, and achieve higher accuracy even with shallower networks.
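To make the GA side of the abstract concrete, the following is a minimal, hypothetical sketch of a genetic algorithm searching over a discrete set of network hyperparameters (filter counts, neuron counts, activation type). It is not the paper's method: the toy `fitness` function stands in for "train the candidate network and measure validation accuracy", and the plain `mutate` step is exactly what the proposed Bi-GAN would replace by generating unseen, refined candidates during evolution.

```python
import random

# Hypothetical search space: the kinds of parameters the paper refines.
SEARCH_SPACE = {
    "conv_filters": [16, 32, 64, 128],
    "dense_neurons": [64, 128, 256, 512],
    "activation": ["relu", "tanh", "sigmoid"],
}

def random_candidate(rng):
    """One chromosome: a choice per hyperparameter."""
    return {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}

def fitness(cand):
    """Toy stand-in for training the network and scoring accuracy."""
    score = cand["conv_filters"] / 128.0 + cand["dense_neurons"] / 512.0
    if cand["activation"] == "relu":
        score += 0.5
    return score

def crossover(a, b, rng):
    """Uniform crossover: pick each gene from either parent."""
    return {k: rng.choice([a[k], b[k]]) for k in SEARCH_SPACE}

def mutate(cand, rng, rate=0.2):
    """Random resampling of genes; a Bi-GAN would instead propose
    unseen configurations here, per the paper's idea."""
    return {k: (rng.choice(SEARCH_SPACE[k]) if rng.random() < rate else v)
            for k, v in cand.items()}

def evolve(generations=20, pop_size=10, seed=0):
    rng = random.Random(seed)
    pop = [random_candidate(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection
        children = [
            mutate(crossover(rng.choice(parents), rng.choice(parents), rng), rng)
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Note the key limitation the abstract points out: this GA can only ever recombine values already listed in `SEARCH_SPACE`, whereas the Bi-GAN component is meant to generate configurations outside the pre-defined candidate set.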

