Backpropagation-Free 4D Continuous Ant-Based Neural Topology Search

05/11/2023
by AbdElRahman ElSaid, et al.

Continuous Ant-based Topology Search (CANTS) is a previously introduced nature-inspired neural architecture search (NAS) algorithm based on ant colony optimization (ACO). CANTS uses a continuous search space to indirectly encode a neural architecture search space. Synthetic ant agents explore this continuous space guided by the density and distribution of pheromones, closely mirroring how ants move in the real world. The continuous search space allows CANTS to automate the design of artificial neural networks (ANNs) of any size, removing a key limitation of many current NAS algorithms, which must operate within structures whose size is predetermined by the user. This work extends CANTS by adding a fourth dimension to its search space that represents potential synaptic weights. With this extra dimension, CANTS agents optimize both the architecture and the weights of an ANN without applying backpropagation (BP), significantly reducing the time consumed by the optimization process. Experiments on real-world data demonstrate that the BP-free CANTS algorithm is highly competitive with both CANTS and ANTS while requiring significantly less operation time.
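
As a rough illustration of the core idea, the sketch below shows how a synthetic ant agent might pick its next point in a 4D continuous search space, with the fourth coordinate read off directly as a synaptic weight so that no backpropagation pass is needed to set weights. This is a minimal Python sketch under assumed simplifications, not the authors' implementation; the Pheromone class, sample_next_point function, and all parameter names are hypothetical.

import random

class Pheromone:
    """A pheromone deposit at a point in the 4D search space.
    The first three coordinates place a potential neuron; the fourth
    is interpreted as a candidate synaptic weight."""
    def __init__(self, x, y, z, w, strength=1.0):
        self.point = (x, y, z, w)
        self.strength = strength  # decays over time, reinforced by well-performing ants

def sample_next_point(pheromones, exploration=0.1, noise=0.05):
    """Pick the ant's next 4D point: mostly attracted to strong
    pheromone deposits, occasionally exploring uniformly at random."""
    if not pheromones or random.random() < exploration:
        # Exploration: uniform random position, weight drawn from [-1, 1]
        return (random.random(), random.random(), random.random(),
                random.uniform(-1.0, 1.0))
    # Exploitation: roulette-wheel selection proportional to pheromone
    # strength, then a small Gaussian perturbation around the chosen deposit.
    total = sum(p.strength for p in pheromones)
    r, acc = random.uniform(0.0, total), 0.0
    for p in pheromones:
        acc += p.strength
        if acc >= r:
            return tuple(c + random.gauss(0.0, noise) for c in p.point)
    return pheromones[-1].point

# The sequence of (x, y, z) positions an ant visits would be condensed into
# the nodes and edges of a candidate network; each connection takes its
# weight from the w coordinate, avoiding a backpropagation training pass.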

