Neural Architecture Search via Bregman Iterations

06/04/2021
by Leon Bungert, et al.

We propose a novel strategy for Neural Architecture Search (NAS) based on Bregman iterations. Starting from a sparse neural network, our gradient-based one-shot algorithm gradually adds relevant parameters in an inverse scale space manner. This allows the network to select the architecture in the search space that is best suited to a given task, e.g., by adding neurons or skip connections. We demonstrate that our approach can unveil, for instance, residual autoencoders for denoising, deblurring, and classification tasks. Code is available at https://github.com/TimRoith/BregmanLearning.
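The inverse scale space mechanism the abstract refers to can be illustrated in a few lines: coordinates stay exactly zero until their accumulated (sub)gradient variable crosses a threshold, at which point the corresponding parameter is "added" to the model. Below is a minimal NumPy sketch of a linearized Bregman iteration with an ℓ1 penalty, applied to a toy sparse least-squares problem; the function names, step sizes, and toy data are illustrative assumptions, not the paper's implementation (the linked BregmanLearning repository contains the actual PyTorch code).

```python
import numpy as np

def soft_shrink(v, lam):
    """Soft-thresholding: the proximal operator of lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(grad_fn, theta0, lam=0.1, tau=0.05, delta=1.0, steps=3000):
    """Linearized Bregman iteration for training a sparse parameter vector.

    grad_fn(theta) must return the gradient of the loss at theta.
    Coordinates whose dual variable v stays inside the band [-lam, lam]
    remain exactly zero; a parameter enters the model only once enough
    gradient evidence has accumulated (inverse scale space behaviour).
    """
    theta = theta0.copy()
    # Initialize v consistently with theta0 = prox_{delta*J}(delta*v),
    # where J = lam * ||.||_1 (for theta0 = 0 this gives v = 0).
    v = theta / delta + lam * np.sign(theta)
    for _ in range(steps):
        v = v - tau * grad_fn(theta)         # gradient step on the dual variable
        theta = delta * soft_shrink(v, lam)  # prox step recovers the sparse iterate
    return theta

# Toy usage (hypothetical data): recover a sparse vector from linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 100))
x_true = np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
grad = lambda x: A.T @ (A @ x - b) / len(b)

x = linearized_bregman(grad, np.zeros(100), lam=0.05, tau=0.05, steps=5000)
print("nonzeros found:", np.count_nonzero(x))
```

Starting from the zero vector, the iterate stays fully sparse at first and coordinates activate one group at a time as their dual variables cross the threshold, which is the same principle the paper exploits to grow neurons and skip connections during the search.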

Related research

11/13/2021 · Towards One Shot Search Space Poisoning in Neural Architecture Search
We evaluate the robustness of a Neural Architecture Search (NAS) algorit...

01/16/2020 · MixPath: A Unified Approach for One-shot Neural Architecture Search
The expressiveness of search space is a key concern in neural architectu...

03/31/2020 · MTL-NAS: Task-Agnostic Neural Architecture Search towards General-Purpose Multi-Task Learning
We propose to incorporate neural architecture search (NAS) into general-...

12/11/2020 · AdvantageNAS: Efficient Neural Architecture Search with Credit Assignment
Neural architecture search (NAS) is an approach for automatically design...

11/26/2021 · KNAS: Green Neural Architecture Search
Many existing neural architecture search (NAS) solutions rely on downstr...

08/13/2020 · Can weight sharing outperform random architecture search? An investigation with TuNAS
Efficient Neural Architecture Search methods based on weight sharing hav...

09/01/2019 · Neural Architecture Search for Joint Optimization of Predictive Power and Biological Knowledge
We report a neural architecture search framework, BioNAS, that is tailor...
