EvoPrompting: Language Models for Code-Level Neural Architecture Search

02/28/2023
by Angelica Chen, et al.

Given the recent impressive accomplishments of language models (LMs) for code generation, we explore the use of LMs as adaptive mutation and crossover operators for an evolutionary neural architecture search (NAS) algorithm. While NAS still proves too difficult a task for LMs to succeed at solely through prompting, we find that the combination of evolutionary prompt engineering with soft prompt-tuning, a method we term EvoPrompting, consistently finds diverse and high-performing models. We first demonstrate that EvoPrompting is effective on the computationally efficient MNIST-1D dataset, where it produces convolutional architecture variants that outperform both those designed by human experts and those found by naive few-shot prompting, in terms of both accuracy and model size. We then apply our method to searching for graph neural networks on the CLRS Algorithmic Reasoning Benchmark, where EvoPrompting designs novel architectures that outperform current state-of-the-art models on 21 out of 30 algorithmic reasoning tasks while maintaining similar model size. EvoPrompting is successful at designing accurate and efficient neural network architectures across a variety of machine learning tasks, while also being general enough for easy adaptation to other tasks beyond neural network design.
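The evolutionary loop the abstract describes can be sketched in a few lines: maintain a scored population of candidate architectures (as code), build a few-shot prompt from the best parents, and let an LM act as a combined mutation/crossover operator to propose children. The sketch below is illustrative only and not the authors' implementation; `lm_generate` and `evaluate` are hypothetical placeholders standing in for a prompt-tuned code LM and for training/validating a generated architecture, respectively.

```python
import random

def lm_generate(prompt: str) -> str:
    """Placeholder for a code-generating LM call (in EvoPrompting, a
    prompt-tuned LM). Here it just extends the last parent with a random
    layer-width token so the loop is runnable end to end."""
    return prompt.split("\n")[-1] + f" {random.choice([16, 32, 64, 128])}"

def evaluate(architecture: str) -> float:
    """Placeholder fitness. In the paper this would be validation accuracy
    traded off against model size; here it is a toy score over the widths."""
    widths = [int(tok) for tok in architecture.split() if tok.isdigit()]
    return sum(widths) / (1 + len(widths))

def evoprompt(seed_pool, rounds=3, children_per_round=4, top_k=2):
    # Population holds (score, architecture) pairs, best first.
    population = sorted(((evaluate(a), a) for a in seed_pool), reverse=True)
    for _ in range(rounds):
        # Few-shot prompt from the current best parents: the LM serves as
        # an adaptive mutation/crossover operator over their code.
        parents = [arch for _, arch in population[:top_k]]
        prompt = "\n".join(parents)
        children = [lm_generate(prompt) for _ in range(children_per_round)]
        population += [(evaluate(child), child) for child in children]
        # Keep only the fittest candidates for the next round.
        population = sorted(population, reverse=True)[: top_k + children_per_round]
    return population[0]

best_score, best_arch = evoprompt(["conv 8", "conv 16 16"])
print(best_score, best_arch)
```

Because the best candidate is always retained, the loop's final score never falls below the best seed's score; the real method replaces the toy fitness with actual training runs and periodically prompt-tunes the LM on its own highest-scoring offspring.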


