
Contrastive Embeddings for Neural Architectures

02/08/2021
by Daniel Hesslow, et al.

The performance of neural architecture search (NAS) algorithms depends strongly on how the search space is parametrized. We use contrastive learning to identify networks across different initializations based on their data Jacobians, and automatically produce the first architecture embeddings that are independent of the parametrization of the search space. Using our contrastive embeddings, we show that traditional black-box optimization algorithms, without modification, can reach state-of-the-art performance in NAS. Because our method provides a unified embedding space, we also perform, for the first time, transfer learning between search spaces. Finally, we show how the embeddings evolve during training, motivating future studies of embeddings at different training stages as a way to gain a deeper understanding of the networks in a search space.
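The abstract does not spell out the Jacobian extraction or the contrastive objective, but the core idea can be sketched: two random initializations of the same architecture form a positive pair, their data Jacobians (gradients of the outputs with respect to a fixed probe batch) are encoded, and an InfoNCE-style loss pulls each pair together. The following is a minimal illustrative sketch in PyTorch; the probe shapes, the make_net architecture sampler, the MLP encoder, and the temperature are placeholder assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def data_jacobian(net, probe):
        # Flattened gradient of the summed outputs w.r.t. a fixed probe batch.
        # Summing lets one backward pass return d(out)/d(input) for all probe
        # points at once; the result is one vector characterizing the network.
        x = probe.clone().requires_grad_(True)
        (jac,) = torch.autograd.grad(net(x).sum(), x)
        return jac.flatten()

    def info_nce(z_a, z_b, temperature=0.1):
        # Row i of z_a and row i of z_b come from the same architecture under
        # two different initializations (a positive pair); all other rows in
        # the batch act as negatives.
        z_a, z_b = F.normalize(z_a, dim=1), F.normalize(z_b, dim=1)
        logits = z_a @ z_b.t() / temperature
        return F.cross_entropy(logits, torch.arange(z_a.size(0)))

    def make_net(width):
        # Placeholder for sampling one architecture; each call returns a fresh
        # random initialization of the architecture identified by `width`.
        return nn.Sequential(nn.Linear(16, width), nn.ReLU(), nn.Linear(width, 1))

    probe = torch.randn(32, 16)               # probe inputs shared by all networks
    encoder = nn.Sequential(nn.Linear(32 * 16, 128), nn.ReLU(), nn.Linear(128, 64))
    widths = [8, 16, 24, 32, 40, 48, 56, 64]  # eight toy "architectures"

    # Two independent initializations per architecture -> positive pairs.
    j_a = torch.stack([data_jacobian(make_net(w), probe) for w in widths])
    j_b = torch.stack([data_jacobian(make_net(w), probe) for w in widths])
    loss = info_nce(encoder(j_a), encoder(j_b))
    loss.backward()  # trains the encoder to map re-initializations close together

Because the positive pairs differ only in their random initialization, the encoder is pushed to represent exactly what is invariant across initializations, namely the architecture itself; this is what would make the resulting embedding independent of how the search space happens to be parametrized.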


Related research

10/09/2020
Smooth Variational Graph Embeddings for Efficient Neural Architecture Search
In this paper, we propose an approach to neural architecture search (NAS...

11/22/2020
Evolving Search Space for Neural Architecture Search
The automation of neural architecture design has been a coveted alternat...

12/27/2018
Neural Architecture Search Over a Graph Search Space
Neural architecture search (NAS) enabled the discovery of state-of-the-a...

02/28/2020
Neural Inheritance Relation Guided One-Shot Layer Assignment Search
Layer assignment is seldom picked out as an independent research topic i...

09/30/2019
Towards modular and programmable architecture search
Neural architecture search methods are able to find high performance dee...

02/01/2019
Learnable Embedding Space for Efficient Neural Architecture Compression
We propose a method to incrementally learn an embedding space over the d...