Deep Active Learning with a Neural Architecture Search

11/19/2018
by Yonatan Geifman, et al.

We consider active learning of deep neural networks. Most active learning work in this context has focused on effective querying mechanisms and has assumed that an appropriate network architecture is known a priori for the problem at hand. We challenge this assumption and propose a novel active learning strategy in which the learning algorithm searches for effective architectures on the fly while actively learning. We apply our strategy with three known querying techniques (softmax response, MC-dropout, and coresets) and show that the proposed approach overwhelmingly outperforms active learning with fixed architectures.
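Of the three querying techniques mentioned, softmax response is the simplest: it scores each unlabeled example by how unconfident the network's top softmax prediction is and queries labels for the least confident ones. The snippet below is a minimal sketch of that acquisition step in NumPy; the function names (softmax_response_scores, select_queries) and the batch interface are illustrative assumptions, not the paper's implementation, and MC-dropout or coreset selection would replace the scoring function.

```python
import numpy as np

def softmax_response_scores(probs):
    # Uncertainty per example: 1 minus the top softmax probability.
    # probs: (n_unlabeled, n_classes) array of softmax outputs.
    # Higher score -> lower confidence -> more informative to label.
    return 1.0 - probs.max(axis=1)

def select_queries(probs, budget):
    # Indices of the `budget` least-confident unlabeled examples.
    scores = softmax_response_scores(probs)
    return np.argsort(-scores)[:budget]

# Toy usage: 5 unlabeled points, 3 classes.
probs = np.array([
    [0.95, 0.03, 0.02],  # confident
    [0.40, 0.35, 0.25],  # uncertain
    [0.70, 0.20, 0.10],
    [0.34, 0.33, 0.33],  # most uncertain
    [0.88, 0.07, 0.05],
])
print(select_queries(probs, budget=2))  # -> [3 1]
```

In the paper's setting, this selection step sits inside an active learning loop that also searches for an effective architecture on the fly rather than retraining a single fixed architecture between labeling rounds.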

Related research

06/05/2023 · Deep Active Learning with Structured Neural Depth Search
Previous work optimizes traditional active learning (AL) processes with ...

03/05/2023 · Streaming Active Learning with Deep Neural Networks
Active learning is perhaps most naturally posed as an online learning pr...

06/17/2021 · Gone Fishing: Neural Active Learning with Fisher Embeddings
There is an increasing need for effective active learning algorithms tha...

11/08/2018 · Large-Scale Visual Active Learning with Deep Probabilistic Ensembles
Annotating the right data for training deep neural networks is an import...

12/19/2016 · Active and Continuous Exploration with Deep Neural Networks and Expected Model Output Changes
The demands on visual recognition systems do not end with the complexity...

02/14/2022 · Active Surrogate Estimators: An Active Learning Approach to Label-Efficient Model Evaluation
We propose Active Surrogate Estimators (ASEs), a new method for label-ef...

11/04/2022 · Improved Adaptive Algorithm for Scalable Active Learning with Weak Labeler
Active learning with strong and weak labelers considers a practical sett...
