A Novel Learnable Gradient Descent Type Algorithm for Non-convex Non-smooth Inverse Problems

03/15/2020
by   Qingchao Zhang, et al.

Optimization algorithms for solving nonconvex inverse problems have attracted significant interest recently. However, existing methods require the nonconvex regularization to be smooth or simple to ensure convergence. In this paper, we propose a novel gradient descent type algorithm, leveraging the idea of residual learning and Nesterov's smoothing technique, to solve inverse problems involving general nonconvex and nonsmooth regularization with provable convergence. Moreover, we develop a neural network architecture imitating this algorithm to learn the nonlinear sparsity transformation adaptively from training data; the network inherits the convergence guarantee while accommodating the general nonconvex structure of the learned transformation. Numerical results demonstrate that the proposed network outperforms state-of-the-art methods on a variety of image reconstruction problems in terms of efficiency and accuracy.
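To make the smoothing-plus-gradient-descent idea concrete, the following is a minimal sketch, not the authors' network: the forward operator A, the l1 regularizer, and all parameter values are illustrative assumptions. Nesterov's smoothing replaces the nonsmooth absolute value with its smooth approximation (the Huber function), so the whole objective becomes differentiable and a plain gradient step applies.

```python
import numpy as np

def huber_grad(z, eps):
    """Gradient of the Nesterov-smoothed absolute value (the Huber function)."""
    return np.where(np.abs(z) <= eps, z / eps, np.sign(z))

def smoothed_gradient_descent(A, b, mu=0.1, eps=1e-2, step=None, iters=500):
    """Gradient descent on 0.5*||A x - b||^2 + mu * sum_i f_eps(x_i),
    where f_eps is the smoothed |.| obtained from Nesterov's technique."""
    m, n = A.shape
    if step is None:
        # Lipschitz constant of the smoothed objective's gradient: ||A||^2 + mu/eps
        L = np.linalg.norm(A, 2) ** 2 + mu / eps
        step = 1.0 / L
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + mu * huber_grad(x, eps)
        x = x - step * grad
    return x

# Toy usage: recover a sparse vector from noisy linear measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200)
x_true[rng.choice(200, 10, replace=False)] = 1.0
b = A @ x_true + 0.01 * rng.standard_normal(80)
x_hat = smoothed_gradient_descent(A, b)
```

In the paper's learned setting, the hand-crafted l1 term above would be replaced by a nonconvex, nonsmooth regularizer parameterized by a network, with the smoothing parameter controlling the trade-off between approximation accuracy and gradient Lipschitz continuity.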


Related research

07/22/2020  Learnable Descent Algorithm for Nonsmooth Nonconvex Image Reconstruction
We propose a general learning based framework for solving nonsmooth and ...

11/24/2022  Deep unfolding as iterative regularization for imaging inverse problems
Recently, deep unfolding methods that guide the design of deep neural ne...

10/07/2021  Gradient Step Denoiser for convergent Plug-and-Play
Plug-and-Play methods constitute a class of iterative algorithms for ima...

08/16/2018  On the Convergence of Learning-based Iterative Methods for Nonconvex Inverse Problems
Numerous tasks at the core of statistics, learning and vision areas are ...

12/16/2019  Dense Recurrent Neural Networks for Inverse Problems: History-Cognizant Unrolling of Optimization Algorithms
Inverse problems in medical imaging applications incorporate domain-spec...

11/17/2019  Extra Proximal-Gradient Inspired Non-local Network
Variational method and deep learning method are two mainstream powerful ...

06/20/2017  First Order Methods beyond Convexity and Lipschitz Gradient Continuity with Applications to Quadratic Inverse Problems
We focus on nonconvex and nonsmooth minimization problems with a composi...
