Self-Supervised Neural Architecture Search for Imbalanced Datasets

09/17/2021
by   Aleksandr Timofeev, et al.

Neural Architecture Search (NAS) provides state-of-the-art results when trained on well-curated datasets with annotated labels. However, annotating data, or even having a balanced number of samples per class, can be a luxury for practitioners in many scientific fields, e.g., in the medical domain. To that end, we propose a NAS-based framework that makes three contributions: (a) we focus on the self-supervised scenario, where no labels are required to determine the architecture; (b) we assume the datasets are imbalanced; and (c) we design each component to run on a resource-constrained setup, i.e., on a single GPU (e.g., Google Colab). Our components build on recent developments in self-supervised learning~\citep{zbontar2021barlow} and self-supervised NAS~\citep{kaplan2020self}, and extend them to the case of imbalanced datasets. We conduct experiments on an (artificially) imbalanced version of CIFAR-10 and demonstrate that our proposed method outperforms standard neural networks while using $27\times$ fewer parameters. To validate our approach on naturally imbalanced data, we also conduct experiments on ChestMNIST and COVID-19 X-ray. The results demonstrate that the proposed method can be used on imbalanced datasets while running entirely on a single GPU. Code is available \href{https://github.com/TimofeevAlex/ssnas_imbalanced}{here}.
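The abstract mentions an artificially imbalanced version of CIFAR-10. A common way to construct such a split is exponential (long-tailed) subsampling, where per-class counts decay geometrically from the head class to the tail class. The sketch below illustrates that idea; the class count, maximum samples per class, and imbalance ratio are illustrative assumptions, since the abstract does not specify the exact imbalance profile used in the paper.

```python
import numpy as np

def long_tail_counts(n_classes=10, n_max=5000, imb_ratio=100):
    """Per-class sample counts for an exponentially imbalanced split.

    Class 0 keeps n_max samples; counts decay geometrically so that the
    rarest class keeps n_max / imb_ratio samples. (Illustrative sketch;
    not necessarily the exact setup used in the paper.)
    """
    # Decay factor so that count(n_classes - 1) == n_max / imb_ratio
    mu = (1.0 / imb_ratio) ** (1.0 / (n_classes - 1))
    return [int(n_max * mu ** c) for c in range(n_classes)]

counts = long_tail_counts()
print(counts)  # head class keeps 5000 samples, tail class keeps 50
```

Given these counts, one would subsample each CIFAR-10 class accordingly before running the self-supervised NAS pipeline.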

Related research

07/03/2020 · Self-supervised Neural Architecture Search
Neural Architecture Search (NAS) has been used recently to achieve impro...

06/08/2022 · Towards Self-supervised and Weight-preserving Neural Architecture Search
Neural architecture search (NAS) algorithms save tremendous labor from h...

09/30/2022 · IMB-NAS: Neural Architecture Search for Imbalanced Datasets
Class imbalance is a ubiquitous phenomenon occurring in real world data ...

03/15/2021 · Pretraining Neural Architecture Search Controllers with Locality-based Self-Supervised Learning
Neural architecture search (NAS) has fostered various fields of machine ...

09/06/2022 · Robust and Efficient Imbalanced Positive-Unlabeled Learning with Self-supervision
Learning from positive and unlabeled (PU) data is a setting where the le...

04/03/2023 · Self-Supervised learning for Neural Architecture Search (NAS)
The objective of this internship is to propose an innovative method that...

03/26/2020 · Are Labels Necessary for Neural Architecture Search?
Existing neural network architectures in computer vision — whether desig...
