Hybrid ISTA: Unfolding ISTA With Convergence Guarantees Using Free-Form Deep Neural Networks

04/25/2022
by   Ziyang Zheng, et al.

It is promising to solve linear inverse problems by unfolding iterative algorithms (e.g., the iterative shrinkage thresholding algorithm (ISTA)) as deep neural networks (DNNs) with learnable parameters. However, existing ISTA-based unfolded algorithms restrict the network architectures for iterative updates with the partial weight coupling structure to guarantee convergence. In this paper, we propose hybrid ISTA to unfold ISTA with both pre-computed and learned parameters by incorporating free-form DNNs (i.e., DNNs with arbitrary feasible and reasonable network architectures), while ensuring theoretical convergence. We first develop HCISTA to improve the efficiency and flexibility of classical ISTA (with pre-computed parameters) without compromising the convergence rate in theory. Furthermore, the DNN-based hybrid algorithm is generalized to popular variants of learned ISTA, dubbed HLISTA, to enable a free architecture of learned parameters with a guarantee of linear convergence. To the best of our knowledge, this paper is the first to provide a convergence-provable framework that enables free-form DNNs in ISTA-based unfolded algorithms. This framework is general enough to endow arbitrary DNNs for solving linear inverse problems with convergence guarantees. Extensive experiments demonstrate that hybrid ISTA can reduce the reconstruction error with an improved convergence rate in the tasks of sparse recovery and compressive sensing.
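For orientation, a minimal sketch of the classical ISTA iteration that these unfolded networks build on (not the paper's hybrid algorithm): ISTA solves min_x 0.5*||Ax - b||^2 + lam*||x||_1 by alternating a gradient step with soft-thresholding. The function names `soft_threshold` and `ista` below are illustrative, not from the paper.

```python
import numpy as np

def soft_threshold(v, theta):
    # Element-wise soft-thresholding: the proximal operator of the l1 norm.
    return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

def ista(A, b, lam, num_iters=200):
    # Classical ISTA with a pre-computed step size 1/L, where
    # L = ||A||_2^2 is a Lipschitz constant of the gradient of 0.5*||Ax - b||^2.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        grad = A.T @ (A @ x - b)               # gradient of the data-fidelity term
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

Unfolded variants such as LISTA replace the fixed matrices and thresholds in each iteration with learned per-layer parameters; the hybrid schemes in this paper instead combine the pre-computed ISTA update with a free-form DNN while retaining convergence guarantees.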


Related research

- Theoretical Linear Convergence of Unfolded ISTA and its Practical Weights and Thresholds (08/29/2018)
- Deep unfolding as iterative regularization for imaging inverse problems (11/24/2022)
- Algorithm Unfolding for Block-sparse and MMV Problems with Reduced Training Overhead (09/28/2022)
- Tradeoffs between Convergence Speed and Reconstruction Accuracy in Inverse Problems (05/30/2016)
- Speeding up scaled gradient projection methods using deep neural networks for inverse problems in image processing (02/07/2019)
- Reliability Assurance for Deep Neural Network Architectures Against Numerical Defects (02/13/2023)
- Learning Fast Approximations of Sparse Nonlinear Regression (10/26/2020)
