NeuralArTS: Structuring Neural Architecture Search with Type Theory

10/17/2021
by Robert Wu, et al.

Neural Architecture Search (NAS) algorithms automate the task of finding optimal deep learning architectures given an initial search space of possible operations. Developing these search spaces is usually a manual affair, and searching over a pre-optimized search space is more efficient than searching from scratch. In this paper we present a new framework, the Neural Architecture Type System (NeuralArTS), which categorizes the infinite set of network operations in a structured type system. We further demonstrate how NeuralArTS can be applied to convolutional layers and propose several future directions.
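The abstract does not spell out the formal details of the type system, but a minimal sketch can illustrate the underlying idea: assign each network operation a type, and treat operations with the same type as interchangeable candidates for the same slot in a NAS search space. The Python/PyTorch sketch below is an assumption-laden illustration, not the paper's actual formalism; the names OpType, conv_type, and interchangeable are hypothetical, and the type is taken to be the shape transformation a convolution induces (channel counts and stride), with "same"-style padding assumed.

    from dataclasses import dataclass

    import torch.nn as nn


    @dataclass(frozen=True)
    class OpType:
        """A hypothetical operation type: the shape transformation an op induces."""
        in_channels: int
        out_channels: int
        spatial_scale: float  # ratio of output to input spatial size


    def conv_type(conv: nn.Conv2d) -> OpType:
        """Assign a type to a Conv2d layer from its channel counts and stride.

        Padding is assumed to preserve spatial size apart from the stride
        ("same"-style padding), so the spatial scale is 1 / stride.
        """
        return OpType(
            in_channels=conv.in_channels,
            out_channels=conv.out_channels,
            spatial_scale=1.0 / conv.stride[0],
        )


    def interchangeable(a: nn.Conv2d, b: nn.Conv2d) -> bool:
        """Two operations of the same type can be swapped in a search space."""
        return conv_type(a) == conv_type(b)


    if __name__ == "__main__":
        # A 3x3 and a 5x5 convolution with matching channels and stride share a
        # type under this scheme, so a search space could offer both as
        # candidates for the same position in a network.
        a = nn.Conv2d(32, 64, kernel_size=3, stride=1, padding=1)
        b = nn.Conv2d(32, 64, kernel_size=5, stride=1, padding=2)
        print(interchangeable(a, b))  # True

Under this reading, typing operations by their shape-level behavior is what would let a search space generate or substitute operations without breaking the surrounding network; how the paper's type system actually defines and checks these types is described in the full text.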


Related research

06/28/2021  Poisoning the Search Space in Neural Architecture Search
Deep learning has proven to be a highly effective problem-solving tool f...

10/07/2021  Conceptual Expansion Neural Architecture Search (CENAS)
Architecture search optimizes the structure of a neural network for some...

09/30/2019  Towards modular and programmable architecture search
Neural architecture search methods are able to find high performance dee...

03/25/2021  Recovering Quantitative Models of Human Information Processing with Differentiable Architecture Search
The integration of behavioral phenomena into mechanistic models of cogni...

05/22/2020  An Introduction to Neural Architecture Search for Convolutional Networks
Neural Architecture Search (NAS) is a research field concerned with util...

11/03/2022  Towards Discovering Neural Architectures from Scratch
The discovery of neural architectures from scratch is the long-standing ...

07/12/2020  VINNAS: Variational Inference-based Neural Network Architecture Search
In recent years, neural architecture search (NAS) has received intensive...
