Trilevel Neural Architecture Search for Efficient Single Image Super-Resolution

01/17/2021
by   Yan Wu, et al.

This paper proposes a trilevel neural architecture search (NAS) method for efficient single image super-resolution (SR). We first define a discrete search space at three levels: network-level, cell-level, and kernel-level (convolution kernel). We then apply a new continuous relaxation to this discrete search space, building a hierarchical mixture of network paths, cell operations, and kernel widths. An efficient search algorithm is proposed to perform optimization over this hierarchical supernet, yielding a globally optimized and compressed network via joint convolution-kernel-width pruning, cell-structure search, and network-path optimization. Unlike current NAS methods, we exploit a sorted sparsestmax activation so that the three levels of neural structures contribute sparsely. Consequently, the NAS optimization progressively converges to the neural structures with dominant contributions to the supernet. Moreover, the proposed optimization enables simultaneous search and training in a single phase, dramatically reducing search and training time compared with traditional NAS algorithms. Experiments on standard benchmark datasets demonstrate that our NAS algorithm produces SR models that are significantly lighter in terms of parameters and FLOPS, with PSNR comparable to the current state of the art.
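The sparse mixture idea above can be sketched in a few lines. This is a minimal illustration, not the paper's method: as a simplifying assumption it uses the closely related sparsemax (Euclidean projection onto the probability simplex) in place of the paper's sorted sparsestmax, and stand-in scalar "operations" in place of convolutions of different kernel widths. Like sparsestmax, sparsemax drives low-scoring candidate weights exactly to zero, so only dominant structures contribute to the supernet output.

```python
def sparsemax(z):
    """Project a score vector z onto the probability simplex.

    Unlike softmax, the result is typically sparse: low-scoring
    candidates receive exactly zero weight.
    """
    zs = sorted(z, reverse=True)
    cumsum = 0.0
    k, ksum = 1, zs[0]  # j = 1 always satisfies the support condition
    for j, zj in enumerate(zs, start=1):
        cumsum += zj
        if 1.0 + j * zj > cumsum:   # zj is still inside the support
            k, ksum = j, cumsum
    tau = (ksum - 1.0) / k
    return [max(zi - tau, 0.0) for zi in z]

# Candidate "operations" for one supernet edge (hypothetical stand-ins
# for convolutions of different kernel widths).
ops = [lambda x: x,         # identity / skip connection
       lambda x: 2.0 * x,   # "wide kernel" stand-in
       lambda x: 0.5 * x]   # "narrow kernel" stand-in

# Learnable architecture scores (illustrative values).
alpha = [2.0, 1.0, 0.1]
weights = sparsemax(alpha)  # sparse: only the dominant candidate survives

def mixed_op(x):
    """Weighted sum of candidates; zero-weight ops are effectively pruned."""
    return sum(w * op(x) for w, op in zip(weights, ops) if w > 0.0)
```

With these scores, `weights` comes out as `[1.0, 0.0, 0.0]`, so the mixture collapses to a single operation; in a real differentiable-NAS training loop the scores `alpha` would be optimized jointly with the network weights, and the surviving candidates define the final compressed architecture.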

Related research:

- 05/09/2021: Lightweight Image Super-Resolution with Hierarchical and Differentiable Neural Architecture Search
- 03/10/2020: Hierarchical Neural Architecture Search for Single Image Super-Resolution
- 08/31/2022: QuantNAS for super resolution: searching for efficient quantization-friendly architectures against quantization noise
- 06/23/2019: Densely Connected Search Space for More Flexible Neural Architecture Search
- 03/30/2020: DHP: Differentiable Meta Pruning via HyperNetworks
- 09/02/2020: Real Image Super Resolution Via Heterogeneous Model using GP-NAS
- 11/18/2019: Fine-Grained Neural Architecture Search
