Conceptual Expansion Neural Architecture Search (CENAS)

10/07/2021
by Mohan Singamsetti et al.

Architecture search optimizes the structure of a neural network for a given task instead of relying on manual authoring. However, it is slow, as each candidate architecture is typically trained from scratch. In this paper we present an approach called Conceptual Expansion Neural Architecture Search (CENAS) that combines a sample-efficient, computational-creativity-inspired transfer learning approach with neural architecture search. This approach finds models faster than naive architecture search by transferring existing weights to approximate the parameters of each new model. It outperforms standard transfer learning by allowing new features to be added rather than only modifying existing ones. We demonstrate that our approach outperforms standard neural architecture search and transfer learning methods in terms of efficiency, performance, and parameter count on a variety of transfer learning tasks.
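The key idea of warm-starting a candidate architecture from an existing model's weights can be sketched roughly as follows. This is an illustrative simplification, not the paper's actual procedure (CENAS uses conceptual expansions of weights); the function name `transfer_weights`, the dense-layer representation, and the slice-copy heuristic are all assumptions made for this sketch.

```python
import numpy as np

def transfer_weights(parent_layers, child_shapes, rng=None):
    """Warm-start a child architecture from a parent's trained weights.

    For each child layer, copy the overlapping slice of the parent's
    weight matrix where one exists; entries corresponding to newly
    added features are initialized randomly. Layers beyond the
    parent's depth are initialized entirely from scratch.
    """
    rng = rng or np.random.default_rng(0)
    child_layers = []
    for i, shape in enumerate(child_shapes):
        # Fresh small-scale init for any new features in this layer.
        w = rng.normal(scale=0.01, size=shape)
        if i < len(parent_layers):
            p = parent_layers[i]
            rows = min(shape[0], p.shape[0])
            cols = min(shape[1], p.shape[1])
            # Reuse the learned weights where the shapes overlap.
            w[:rows, :cols] = p[:rows, :cols]
        child_layers.append(w)
    return child_layers
```

Because the copied slices already approximate a trained solution, a candidate initialized this way needs far less training than one started from scratch, which is where the search-time savings come from in approaches of this kind.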
