Connecting Hamilton–Jacobi partial differential equations with maximum a posteriori and posterior mean estimators for some non-convex priors

04/22/2021 ∙ by Jérôme Darbon et al.

Many imaging problems can be formulated as inverse problems expressed as finite-dimensional optimization problems. These optimization problems generally consist of minimizing the sum of a data fidelity term and a regularization term. In [23,26], connections between these optimization problems and (multi-time) Hamilton–Jacobi partial differential equations have been established under convexity assumptions on both the data fidelity and regularization terms. In particular, under these convexity assumptions, representation formulas for a minimizer can be obtained. From a Bayesian perspective, such a minimizer can be seen as a maximum a posteriori estimator. In this chapter, we consider a certain class of non-convex regularizations and show that similar representation formulas for a minimizer can still be obtained. This is achieved by leveraging min-plus algebra techniques originally developed for solving certain Hamilton–Jacobi partial differential equations arising in optimal control. Note that connections between viscous Hamilton–Jacobi partial differential equations and Bayesian posterior mean estimators with Gaussian data fidelity terms and log-concave priors were established in [25]. We also present analogous results for certain Bayesian posterior mean estimators with Gaussian data fidelity terms and certain non-log-concave priors, using an analogue of the min-plus algebra techniques.
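To make these connections concrete, the following is a minimal sketch of the kind of representation formulas at play, written in illustrative notation (a quadratic data fidelity $\tfrac{1}{2t}\|x-u\|^2$, a regularizer $J$, and parameters $t,\varepsilon>0$ chosen here for exposition); the precise hypotheses and statements are those of [23,25,26].

```latex
% MAP estimator via the Hopf--Lax formula (first-order HJ PDE):
\[
  S(x,t) = \min_{u \in \mathbb{R}^n} \Big\{ J(u) + \tfrac{1}{2t}\|x-u\|^2 \Big\},
  \qquad
  \partial_t S + \tfrac{1}{2}\|\nabla_x S\|^2 = 0, \quad S(\cdot,0) = J,
\]
% with a minimizer recovered from the spatial gradient of the solution:
\[
  u_{\mathrm{MAP}}(x,t) = x - t\,\nabla_x S(x,t).
\]
% Min-plus principle: if the non-convex regularizer decomposes as
% J = \min_i J_i with each J_i convex, then
\[
  S(x,t) = \min_i S_i(x,t),
\]
% where S_i is the Hopf--Lax solution with initial data J_i, so the
% non-convex problem reduces to finitely many convex subproblems.
% Posterior mean via a Cole--Hopf transform (viscous HJ PDE):
\[
  S_\varepsilon(x,t)
    = -\varepsilon \log\!\Big( (2\pi\varepsilon t)^{-n/2}
      \int_{\mathbb{R}^n}
      e^{-\frac{1}{\varepsilon}\left( J(u) + \frac{\|x-u\|^2}{2t} \right)} \, du \Big),
\]
\[
  \partial_t S_\varepsilon + \tfrac{1}{2}\|\nabla_x S_\varepsilon\|^2
    = \tfrac{\varepsilon}{2}\,\Delta_x S_\varepsilon,
  \qquad
  u_{\mathrm{PM}}(x,t) = x - t\,\nabla_x S_\varepsilon(x,t).
\]
```

In this sketch, the min-plus decomposition plays for the first-order equation the role that linearity (superposition of integrals) plays for the viscous one, which is the analogy exploited for the non-log-concave posterior mean results.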
