Bilevel Optimization, Deep Learning and Fractional Laplacian Regularization with Applications in Tomography

07/22/2019
by Harbir Antil, et al.

In this work we consider a generalized bilevel optimization framework for solving inverse problems. We introduce the fractional Laplacian as a regularizer to improve the reconstruction quality and compare it with total variation regularization. We emphasize that the key advantage of the fractional Laplacian regularizer is that it leads to a linear operator, as opposed to total variation regularization, which results in a nonlinear degenerate operator. Inspired by residual neural networks, we develop a dedicated bilevel optimization neural network with variable depth for a general regularized inverse problem, which learns the optimal regularization strength and the exponent of the fractional Laplacian. We also draw parallels between an activation function in a neural network and regularization, and illustrate how to incorporate various regularizer choices into the proposed network. As an example, we consider tomographic reconstruction as a model problem and show an improvement in reconstruction quality, especially for limited data, via fractional Laplacian regularization. We successfully learn the regularization strength and the fractional exponent via the proposed bilevel optimization neural network, and we observe that fractional Laplacian regularization outperforms total variation regularization. This is especially encouraging, and important, in the case of limited and noisy data.
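
To make the setup concrete, the sketch below (not the authors' implementation) illustrates the two ingredients described in the abstract: a lower-level reconstruction with fractional Laplacian regularization and an upper-level search for the regularization strength alpha and the exponent s. For simplicity it assumes an identity forward operator (pure denoising) instead of the Radon transform used in the tomography example, uses the spectral definition of the fractional Laplacian on a periodic grid, and replaces the paper's gradient-based bilevel neural network with a crude grid search. The function names and the toy phantom are purely illustrative.

```python
# Minimal sketch (illustrative only): fractional Laplacian regularized
# denoising, with the exponent s and strength alpha chosen at the upper level.
import numpy as np

def fractional_laplacian_denoise(y, s, alpha):
    """Solve min_u 0.5*||u - y||^2 + 0.5*alpha*||(-Delta)^{s/2} u||^2
    on a periodic grid via the FFT diagonalization of the Laplacian."""
    n1, n2 = y.shape
    k1 = 2 * np.pi * np.fft.fftfreq(n1)
    k2 = 2 * np.pi * np.fft.fftfreq(n2)
    # Symbol of -Delta on the periodic grid; its s-th power gives (-Delta)^s.
    lam = k1[:, None] ** 2 + k2[None, :] ** 2
    filt = 1.0 / (1.0 + alpha * lam ** s)  # closed-form minimizer in Fourier space
    return np.real(np.fft.ifft2(filt * np.fft.fft2(y)))

def upper_level_loss(params, y_noisy, u_true):
    """Bilevel upper-level objective: reconstruction error of the
    lower-level solution for a given (s, alpha)."""
    s, alpha = params
    u = fractional_laplacian_denoise(y_noisy, s, alpha)
    return 0.5 * np.sum((u - u_true) ** 2)

# Toy training pair: a piecewise-constant phantom plus Gaussian noise.
rng = np.random.default_rng(0)
u_true = np.zeros((64, 64))
u_true[16:48, 16:48] = 1.0
y_noisy = u_true + 0.1 * rng.standard_normal(u_true.shape)

# Crude upper-level "learning": grid search over (s, alpha) in place of the
# bilevel optimization neural network described in the paper.
candidates = [(s, a) for s in np.linspace(0.2, 1.0, 5)
                     for a in np.logspace(-3, 0, 7)]
best = min(candidates, key=lambda p: upper_level_loss(p, y_noisy, u_true))
print("selected (s, alpha):", best)
```

The closed-form Fourier solve in the lower level reflects the linearity advantage highlighted in the abstract: the fractional Laplacian penalty leads to a linear operator, whereas a total variation penalty would require an iterative solver for a nonlinear degenerate problem.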

Related research

11/01/2021  A general fractional total variation-Gaussian (GFTG) prior for Bayesian inverse problems
In this paper, we investigate the imaging inverse problem by employing a...

01/10/2020  Parameter learning and fractional differential operators: application in image regularization and decomposition
In this paper, we focus on learning optimal parameters for PDE-based ima...

11/06/2020  Discretization of learned NETT regularization for solving inverse problems
Deep learning based reconstruction methods deliver outstanding results f...

03/04/2017  Convex Geometry of the Generalized Matrix-Fractional Function
Generalized matrix-fractional (GMF) functions are a class of matrix supp...

05/22/2022  A preconditioned deepest descent algorithm for a class of optimization problems involving the p(x)-Laplacian operator
In this paper we are concerned with a class of optimization problems inv...

01/27/2021  Anti-Aliasing Add-On for Deep Prior Seismic Data Interpolation
Data interpolation is a fundamental step in any seismic processing workf...

08/05/2021  Deep Neural Networks and PIDE discretizations
In this paper, we propose neural networks that tackle the problems of st...