αNAS: Neural Architecture Search using Property Guided Synthesis

05/08/2022
by Charles Jin, et al.

In the past few years, neural architecture search (NAS) has become an increasingly important tool within the deep learning community. Despite the many recent successes of NAS, however, most existing approaches operate within highly structured design spaces, and hence explore only a small fraction of the full search space of neural architectures while also requiring significant manual effort from domain experts. In this work, we develop techniques that enable efficient NAS in a significantly larger design space. To accomplish this, we propose to perform NAS in an abstract search space of program properties. Our key insights are as follows: (1) the abstract search space is significantly smaller than the original search space, and (2) architectures with similar program properties also have similar performance; thus, we can search more efficiently in the abstract search space. To enable this approach, we also propose a novel efficient synthesis procedure, which accepts a set of promising program properties and returns a satisfying neural architecture. We implement our approach, αNAS, within an evolutionary framework, where the mutations are guided by the program properties. Starting with a ResNet-34 model, αNAS produces a model with slightly improved accuracy on CIFAR-10 but 96% fewer parameters; starting with a Vision Transformer, it produces a model with 30% fewer FLOPS and 14% fewer parameters without any degradation in accuracy.
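The sketch below is a minimal, hypothetical illustration of the kind of loop the abstract describes: lift a candidate architecture to an abstract space of program properties, mutate the properties rather than the architecture, synthesize a concrete architecture satisfying the mutated properties, and keep it if it scores better. The property abstraction, mutation rule, synthesis step, and fitness function here are toy stand-ins, not the paper's actual αNAS components.

```python
import random

def abstract_properties(arch):
    # Toy abstraction: summarize an architecture (here, a list of layer
    # widths) by coarse "program properties": depth and a total-width bucket.
    return {"depth": len(arch), "width_bucket": max(1, sum(arch) // 64)}

def mutate_properties(props):
    # Mutate in the abstract space: perturb one property instead of
    # editing the concrete architecture directly.
    mutated = dict(props)
    key = random.choice(list(mutated))
    mutated[key] = max(1, mutated[key] + random.choice([-1, 1]))
    return mutated

def synthesize(props):
    # Toy "synthesis": return a concrete architecture that satisfies the
    # requested properties (right depth, roughly the requested total width).
    width = max(8, (props["width_bucket"] * 64) // props["depth"])
    return [width] * props["depth"]

def fitness(arch):
    # Stand-in objective: in a real system this would come from training and
    # validating the candidate (possibly penalized by FLOPS/parameter count).
    return len(arch) - 0.001 * sum(arch)

def evolve(seed_arch, generations=50):
    best, best_score = seed_arch, fitness(seed_arch)
    for _ in range(generations):
        props = mutate_properties(abstract_properties(best))  # abstract + mutate
        candidate = synthesize(props)                         # back to concrete
        score = fitness(candidate)                            # evaluate
        if score > best_score:
            best, best_score = candidate, score
    return best

if __name__ == "__main__":
    print(evolve([64, 64, 128, 128]))  # seed "architecture": layer widths
```

In αNAS itself, evaluation involves training the candidate networks, and the synthesis procedure targets richer program properties than the depth/width summary used in this toy example.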

