Towards Self-supervised and Weight-preserving Neural Architecture Search

06/08/2022
by   Zhuowei Li, et al.

Neural architecture search (NAS) algorithms spare human experts tremendous design labor, and recent advancements have further reduced the computational overhead to an affordable level. However, it remains cumbersome to deploy NAS techniques in real-world applications due to their intricate multi-stage procedures and reliance on the supervised learning paradigm. In this work, we propose self-supervised and weight-preserving neural architecture search (SSWP-NAS) as an extension of the current NAS framework that allows self-supervised training and retains the concomitant weights discovered during the search stage. In doing so, we simplify the NAS workflow to a one-stage, proxy-free procedure. Experiments show that the architectures searched by the proposed framework achieve state-of-the-art accuracy on the CIFAR-10, CIFAR-100, and ImageNet datasets without using manual labels. Moreover, we show that using the concomitant weights as initialization consistently outperforms both random initialization and the two-stage weight pre-training method by a clear margin under semi-supervised learning scenarios. Code is publicly available at https://github.com/LzVv123456/SSWP-NAS.
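For intuition, below is a minimal, hypothetical sketch of the idea the abstract describes: a differentiable supernet whose operation weights and architecture parameters are trained jointly (one stage) with a label-free contrastive objective, after which the weights discovered during search (the "concomitant" weights) are saved for reuse as initialization. All names (TinySupernet, simclr_like_loss), shapes, and hyperparameters here are illustrative assumptions, not taken from the paper or its repository.

```python
# Hypothetical sketch of a self-supervised, weight-preserving search loop.
# Not the authors' implementation; names and hyperparameters are made up.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySupernet(nn.Module):
    """A toy mixed-operation cell: two candidate ops weighted by softmaxed alphas."""
    def __init__(self, channels=16):
        super().__init__()
        self.stem = nn.Conv2d(3, channels, 3, padding=1)
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, 3, padding=1),
            nn.Conv2d(channels, channels, 5, padding=2),
        ])
        self.alphas = nn.Parameter(torch.zeros(len(self.ops)))  # architecture params
        self.head = nn.Linear(channels, 64)  # projection head for self-supervision

    def forward(self, x):
        x = F.relu(self.stem(x))
        w = F.softmax(self.alphas, dim=0)
        x = sum(wi * op(x) for wi, op in zip(w, self.ops))  # weighted op mixture
        x = F.adaptive_avg_pool2d(x, 1).flatten(1)
        return F.normalize(self.head(x), dim=1)

def simclr_like_loss(z1, z2, tau=0.5):
    """NT-Xent-style contrastive loss between two augmented views (no labels)."""
    n = z1.size(0)
    z = torch.cat([z1, z2], dim=0)
    sim = z @ z.t() / tau
    sim = sim.masked_fill(torch.eye(2 * n, dtype=torch.bool), float('-inf'))
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

net = TinySupernet()
w_opt = torch.optim.SGD([p for name, p in net.named_parameters() if name != 'alphas'], lr=0.01)
a_opt = torch.optim.Adam([net.alphas], lr=3e-4)

for step in range(3):  # toy loop; a real search would iterate over a dataset
    x = torch.randn(8, 3, 32, 32)
    # Two noisy "views" stand in for real data augmentations.
    view1, view2 = x + 0.1 * torch.randn_like(x), x + 0.1 * torch.randn_like(x)
    loss = simclr_like_loss(net(view1), net(view2))
    w_opt.zero_grad()
    a_opt.zero_grad()
    loss.backward()
    w_opt.step()
    a_opt.step()  # weights and architecture updated in one stage, no labels

# Weight-preserving part: keep the searched weights instead of re-initializing.
torch.save(net.state_dict(), 'concomitant_weights.pt')
```

The last line captures the "weight-preserving" contrast with conventional NAS pipelines, which discard the supernet weights and retrain the derived architecture from scratch; the abstract reports that reusing these weights as initialization outperforms random initialization and two-stage pre-training in semi-supervised settings.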


Related research

05/15/2022 · Proxyless Neural Architecture Adaptation for Supervised Learning and Self-Supervised Learning
Recently, Neural Architecture Search (NAS) methods have been introduced ...

03/15/2021 · Pretraining Neural Architecture Search Controllers with Locality-based Self-Supervised Learning
Neural architecture search (NAS) has fostered various fields of machine ...

09/17/2021 · Self-Supervised Neural Architecture Search for Imbalanced Datasets
Neural Architecture Search (NAS) provides state-of-the-art results when ...

04/03/2023 · Self-Supervised learning for Neural Architecture Search (NAS)
The objective of this internship is to propose an innovative method that...

01/31/2023 · NASiam: Efficient Representation Learning using Neural Architecture Search for Siamese Networks
Siamese networks are one of the most trending methods to achieve self-su...

04/23/2020 · Depth-Wise Neural Architecture Search
Modern convolutional networks such as ResNet and NASNet have achieved st...

02/24/2020 · Semi-Supervised Neural Architecture Search
Neural architecture search (NAS) relies on a good controller to generate...
