A Design Space Study for LISTA and Beyond

04/08/2021
by Tianjian Meng, et al.

In recent years, unrolling iterative algorithms into problem-specific deep networks has achieved great success in solving inverse problems and beyond. Unrolling is believed to combine the model-based prior with the learning capacity of deep networks. This paper revisits the role of unrolling as a design approach for deep networks: to what extent is the resulting specialized architecture actually superior, and can we find better ones? Using LISTA for sparse recovery as a representative example, we conduct the first thorough design space study of unrolled models. Among all possible variations, we focus on extensively varying the connectivity patterns and neuron types, which yields a gigantic design space arising from LISTA. To explore this space efficiently and identify top performers, we leverage the emerging tool of neural architecture search (NAS). We carefully examine the top searched architectures in a number of settings and are able to discover networks that consistently outperform LISTA. We further present visualizations and analysis to "open the black box", and find that the top searched architectures exhibit highly consistent and potentially transferable patterns. We hope our study will spark further reflection and exploration on how to better combine model-based optimization priors with data-driven learning.
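
As background for the unrolling idea referenced above, the sketch below shows a minimal LISTA forward pass in NumPy: each unrolled layer mirrors one ISTA iteration, with the matrices (W_e, S) and thresholds (theta) turned into learned, layer-specific parameters. The function and variable names (lista_forward, W_e, S_list, theta_list) are illustrative assumptions, not code from the paper; the paper's design space study further varies the connectivity between such layers and the neuron (activation) types.

```python
import numpy as np

def soft_threshold(x, theta):
    # Elementwise shrinkage: the proximal operator of the L1 norm.
    return np.sign(x) * np.maximum(np.abs(x) - theta, 0.0)

def lista_forward(b, W_e, S_list, theta_list):
    # b          : observation vector, shape (m,)
    # W_e        : learned encoder matrix, shape (n, m)
    # S_list     : learned per-layer matrices, each of shape (n, n)
    # theta_list : learned per-layer thresholds; len(theta_list) == len(S_list) + 1
    x = soft_threshold(W_e @ b, theta_list[0])
    for S, theta in zip(S_list, theta_list[1:]):
        # Each unrolled layer mirrors one ISTA iteration:
        #   x <- soft_threshold(W_e @ b + S @ x, theta)
        x = soft_threshold(W_e @ b + S @ x, theta)
    return x

# Toy usage with random (untrained) parameters and 4 unrolled layers.
rng = np.random.default_rng(0)
m, n, K = 20, 50, 4
b = rng.standard_normal(m)
W_e = 0.1 * rng.standard_normal((n, m))
S_list = [0.1 * rng.standard_normal((n, n)) for _ in range(K - 1)]
theta_list = [0.1] * K
x_hat = lista_forward(b, W_e, S_list, theta_list)
```

In practice the parameters of each layer are trained end-to-end on (b, x) pairs; the sketch only illustrates the unrolled structure that the paper's search space generalizes.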

