
HMCNAS: Neural Architecture Search using Hidden Markov Chains and Bayesian Optimization

by   Vasco Lopes, et al.

Neural Architecture Search has achieved state-of-the-art performance in a variety of tasks, outperforming human-designed networks. However, many assumptions that require human definition, related to the problem being solved or the models being generated, are still needed: final model architectures, the number of layers to be sampled, forced operations, and small search spaces. These choices ultimately contribute to higher-performing models at the cost of inducing bias into the system. In this paper, we propose HMCNAS, which is composed of two novel components: i) a method that leverages information about human-designed models to autonomously generate a complex search space, and ii) an Evolutionary Algorithm with Bayesian Optimization that is capable of generating competitive CNNs from scratch, without relying on human-defined parameters or small search spaces. The experimental results show that the proposed approach yields competitive architectures in a very short time. HMCNAS provides a step towards generalizing NAS by offering a way to create competitive models without requiring any human knowledge about the specific task.
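To make the first component concrete, a minimal sketch of the general idea behind a Markov-chain-derived search space follows. This is not the paper's implementation: the layer vocabulary, the example sequences, and the first-order chain are all illustrative assumptions. The sketch estimates transition probabilities between layer types from a handful of hypothetical human-designed CNN layouts, then random-walks the chain to sample new candidate architectures.

```python
import random
from collections import defaultdict

# Hypothetical layer sequences standing in for human-designed CNNs
# (illustrative only; not the architectures used in the paper).
human_models = [
    ["conv", "bn", "relu", "conv", "bn", "relu", "pool", "fc"],
    ["conv", "relu", "pool", "conv", "relu", "pool", "fc"],
    ["conv", "bn", "relu", "pool", "conv", "bn", "relu", "fc"],
]

def learn_transitions(sequences):
    """Estimate first-order Markov transition probabilities between layer types."""
    counts = defaultdict(lambda: defaultdict(int))
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return {
        a: {b: c / sum(nxt.values()) for b, c in nxt.items()}
        for a, nxt in counts.items()
    }

def sample_architecture(transitions, start="conv", end="fc", max_len=20):
    """Random-walk the chain from `start` until `end` (or a length cap)."""
    seq = [start]
    while seq[-1] != end and len(seq) < max_len:
        nxt = transitions[seq[-1]]
        layers, probs = zip(*nxt.items())
        seq.append(random.choices(layers, weights=probs)[0])
    return seq

transitions = learn_transitions(human_models)
candidate = sample_architecture(transitions)
```

Each sampled sequence is one candidate in the search space; in HMCNAS such candidates would then be evaluated and refined by the evolutionary search with Bayesian Optimization, rather than scored directly as here.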

