MS-BACO: A new Model Selection algorithm using Binary Ant Colony Optimization for neural complexity and error reduction

10/21/2018
by Saman Sadeghyan, et al.

Controlling the complexity of a Feedforward Neural Network (FNN) for a given approximation task amounts to choosing an appropriate model size, which is strongly correlated with generalization quality and computational efficiency. However, deciding on the right level of model complexity can be highly challenging in FNN applications. In this paper, a new Model Selection algorithm using Binary Ant Colony Optimization (MS-BACO) is proposed to find the optimal FNN model in terms of neural complexity and cross-entropy error. MS-BACO is a meta-heuristic algorithm that treats model selection as a combinatorial optimization problem. By quantifying both the amount of correlation among hidden neurons and the sensitivity of the FNN output to those neurons, using a sample-based sensitivity analysis method called the extended Fourier amplitude sensitivity test, the algorithm tends to select FNN models whose hidden neurons have the most distinct hyperplanes and the highest contribution percentages. The performance of the proposed algorithm is investigated with three different designs of heuristic information. Comparison of the findings verifies that the newly introduced algorithm provides a more compact and accurate FNN model.
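The core idea — ants building binary inclusion masks over hidden neurons, guided by pheromone values and a fitness that trades accuracy against complexity — can be illustrated with a minimal sketch. This is not the paper's implementation: the `binary_aco` helper, its parameters, and the toy fitness are assumptions for illustration; in MS-BACO the fitness would combine cross-entropy error with the neuron-correlation and eFAST-based sensitivity terms described above.

```python
import random

def binary_aco(n_neurons, fitness, n_ants=10, n_iters=30, rho=0.1, seed=0):
    """Minimal binary ant colony optimization over neuron-inclusion masks.

    fitness(mask) must return a score to MAXIMIZE (e.g. negative
    cross-entropy minus a penalty on the number of selected neurons).
    This helper is an illustrative sketch, not MS-BACO itself.
    """
    rng = random.Random(seed)
    # One pheromone value per hidden neuron = probability of keeping it.
    tau = [0.5] * n_neurons
    best_mask, best_fit = None, float("-inf")
    for _ in range(n_iters):
        for _ in range(n_ants):
            # Each ant builds a binary mask bit by bit from the pheromones.
            mask = [1 if rng.random() < tau[i] else 0 for i in range(n_neurons)]
            f = fitness(mask)
            if f > best_fit:
                best_mask, best_fit = mask[:], f
        # Evaporate pheromone, then reinforce the best-so-far mask.
        for i in range(n_neurons):
            tau[i] = (1 - rho) * tau[i] + rho * best_mask[i]
            tau[i] = min(0.95, max(0.05, tau[i]))  # keep exploration alive
    return best_mask, best_fit

# Toy usage: recover a known-good subset of 5 hidden neurons.
target = [1, 0, 1, 0, 1]
mask, score = binary_aco(5, lambda m: -sum(abs(a - b) for a, b in zip(m, target)))
```

The pheromone clamping to [0.05, 0.95] is one common way to prevent premature convergence to a single mask; the actual update rule and heuristic-information designs evaluated in the paper differ.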




08/17/2018

Optimizing Deep Neural Network Architecture: A Tabu Search Based Approach

The performance of Feedforward neural network (FNN) fully depends upon ...
06/21/2021

A generalized EMS algorithm for model selection with incomplete data

Recently, a so-called E-MS algorithm was developed for model selection i...
04/13/2018

A new robust feature selection method using variance-based sensitivity analysis

Excluding irrelevant features in a pattern recognition task plays an imp...
10/02/2020

The Efficacy of L_1 Regularization in Two-Layer Neural Networks

A crucial problem in neural networks is to select the most appropriate n...
08/26/2021

Consistent Relative Confidence and Label-Free Model Selection for Convolutional Neural Networks

This paper is concerned with image classification based on deep convolut...
11/23/2016

Tunable Sensitivity to Large Errors in Neural Network Training

When humans learn a new concept, they might ignore examples that they ca...
08/05/2017

Quantifying homologous proteins and proteoforms

Many proteoforms - arising from alternative splicing, post-translational...