Accelerated PDE's for efficient solution of regularized inversion problems

09/30/2018
by Minas Benyamin, et al.

We further develop a new framework, called PDE Acceleration, by applying it to calculus of variations problems defined for general functions on R^n, obtaining efficient numerical algorithms to solve the resulting class of optimization problems based on simple discretizations of their corresponding accelerated PDEs. While the resulting family of PDEs and numerical schemes is quite general, we give special attention to their application to regularized inversion problems, with illustrative examples drawn from popular image processing applications. The method is a generalization of momentum, or accelerated, gradient descent to the PDE setting. For elliptic problems, the descent equations are a nonlinear damped wave equation rather than a diffusion equation, and the acceleration is realized as an improvement in the CFL condition from Δt ∼ Δx² (for diffusion) to Δt ∼ Δx (for wave equations). We work out several explicit numerical schemes, as well as a semi-implicit scheme, together with their necessary stability constraints, and include recursive update formulations that allow minimal-effort adaptation of existing gradient descent PDE codes into the accelerated PDE framework. We explore these schemes in detail for a broad class of regularized inversion applications, with special attention to quadratic, Beltrami, and Total Variation regularization, where the accelerated PDE takes the form of a nonlinear wave equation. Experimental examples demonstrate the application of these schemes to image denoising, deblurring, and inpainting, including comparisons against Primal-Dual, Split Bregman, and ADMM algorithms.
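To make the recursive update formulation concrete, the following is a minimal sketch (not the authors' code) of PDE-accelerated descent for the quadratic-regularization denoising case on a unit-spaced grid. The function names, parameter values, and the particular explicit treatment of the damping term are illustrative assumptions; the time step must still respect the wave-equation CFL condition Δt ∼ Δx.

```python
import numpy as np

def laplacian(u):
    # 5-point Laplacian with replicated (Neumann) boundary conditions, dx = 1
    p = np.pad(u, 1, mode="edge")
    return p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4.0 * u

def accelerated_denoise(f, lam=0.1, damping=1.0, dt=0.5, n_iter=500):
    """Momentum-style (PDE-accelerated) descent for quadratic denoising.

    Illustrative sketch: minimizes E(u) = 0.5*lam*|grad u|^2 + 0.5*(u - f)^2
    by integrating the damped wave equation
        u_tt + damping * u_t = lam * Lap(u) - (u - f)
    with an explicit scheme whose stable time step scales like dt ~ dx,
    versus dt ~ dx^2 for ordinary gradient-descent diffusion.
    """
    u = f.astype(float).copy()
    v = np.zeros_like(u)                        # velocity field u_t
    for _ in range(n_iter):
        force = lam * laplacian(u) - (u - f)    # negative functional gradient
        v = (1.0 - damping * dt) * v + dt * force   # damped velocity update
        u = u + dt * v                          # image update along velocity
    return u
```

Discarding the velocity and stepping u directly along the force recovers ordinary diffusion-type gradient descent, which by contrast requires the much smaller Δt ∼ Δx² step for stability; the momentum variable is the only extra state the accelerated scheme carries.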

