DARTS for Inverse Problems: a Study on Hyperparameter Sensitivity

08/12/2021
by Jonas Geiping, et al.

Differentiable architecture search (DARTS) is a widely researched tool for neural architecture search, due to its promising results for image classification. The main benefit of DARTS is the effectiveness achieved through the weight-sharing one-shot paradigm, which allows efficient architecture search. In this work, we investigate DARTS in a systematic case study of inverse problems, which allows us to analyze these potential benefits in a controlled manner. Although we demonstrate that the success of DARTS can be extended from image classification to reconstruction, our experiments reveal three fundamental difficulties in the evaluation of DARTS-based methods: First, the results show a large variance in all test cases. Second, the final performance depends strongly on the hyperparameters of the optimizer. Third, the performance of the weight-sharing architecture used during search is a poor predictor of the final performance of the found architecture. We therefore conclude that DARTS-based methods should 1) report results from several runs along with the underlying performance statistics, 2) report the correlation between search and final architecture performance, and 3) carefully weigh whether the computational efficiency of DARTS outweighs the costs of hyperparameter optimization and multiple runs.
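For readers unfamiliar with the mechanism the abstract refers to, the sketch below illustrates the continuous relaxation at the core of DARTS: every edge of the searched cell computes a softmax-weighted mixture of candidate operations, so the architecture parameters can be optimized by gradient descent while the network weights are shared across all candidates. This is a minimal PyTorch sketch under assumed conventions, not the paper's code; the MixedOp name and the three-operation candidate set are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One weight-sharing edge of a DARTS cell: a softmax-weighted
    sum of candidate operations. The candidate set is illustrative."""

    def __init__(self, channels: int):
        super().__init__()
        self.ops = nn.ModuleList([
            nn.Identity(),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.AvgPool2d(kernel_size=3, stride=1, padding=1),
        ])
        # One architecture parameter (alpha) per candidate operation;
        # in DARTS these are trained on validation data, while the
        # operation weights are trained on training data.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.alpha, dim=0)  # continuous relaxation
        return sum(w * op(x) for w, op in zip(weights, self.ops))
```

After search, DARTS discretizes by keeping only the highest-weighted operation on each edge; the third difficulty above is precisely that the supernet's performance during search is a poor proxy for this discretized architecture. The reporting practices the paper calls for amount to a small amount of bookkeeping per experiment, sketched below with purely hypothetical placeholder numbers (PSNR is an assumed reconstruction metric, and the values are not results from the paper):

```python
import numpy as np
from scipy import stats

# Hypothetical placeholder scores (PSNR in dB) for five independent
# DARTS runs: supernet score during search vs. final score of the
# retrained, discretized architecture. Not results from the paper.
search_scores = np.array([29.1, 28.7, 29.4, 28.2, 29.0])
final_scores = np.array([30.2, 28.9, 29.1, 30.0, 28.5])

# 1) Report statistics over several runs, not a single best run.
print(f"final PSNR: {final_scores.mean():.2f} +/- {final_scores.std(ddof=1):.2f} dB")

# 2) Report how well search performance predicts final performance.
rho, p = stats.spearmanr(search_scores, final_scores)
print(f"search-vs-final Spearman correlation: rho={rho:.2f} (p={p:.2f})")
```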

Related research

- 03/04/2022 · WPNAS: Neural Architecture Search by jointly using Weight Sharing and Predictor
  Weight sharing based and predictor based methods are two major types of ...

- 08/01/2018 · Efficient Progressive Neural Architecture Search
  This paper addresses the difficult problem of finding an optimal neural ...

- 08/13/2020 · Can weight sharing outperform random architecture search? An investigation with TuNAS
  Efficient Neural Architecture Search methods based on weight sharing hav...

- 01/06/2020 · Deeper Insights into Weight Sharing in Neural Architecture Search
  With the success of deep neural networks, Neural Architecture Search (NA...

- 06/08/2020 · Revisiting the Train Loss: an Efficient Performance Estimator for Neural Architecture Search
  Reliable yet efficient evaluation of generalisation performance of a pro...

- 10/29/2020 · Cream of the Crop: Distilling Prioritized Paths For One-Shot Neural Architecture Search
  One-shot weight sharing methods have recently drawn great attention in n...

- 09/20/2019 · Understanding and Robustifying Differentiable Architecture Search
  Differentiable Architecture Search (DARTS) has attracted a lot of attent...
