Sequential Projected Newton method for regularization of nonlinear least squares problems
We develop an efficient algorithm for the regularization of nonlinear inverse problems based on the discrepancy principle. We formulate the problem as an equality-constrained optimization problem, where the constraint is given by a least squares data fidelity term and expresses the discrepancy principle. The objective function is a convex regularization function that incorporates some prior knowledge, such as the total variation regularization function. Using the Jacobian matrix of the nonlinear forward model, we consider a sequence of quadratically constrained optimization problems that can all be solved using the Projected Newton method. We show that the solution of each sub-problem is a descent direction for an exact merit function, which allows us to formulate a formal line-search method. We also formulate a slightly more heuristic approach that simplifies the algorithm and yields a significant computational improvement. We illustrate the robustness and effectiveness of our approach in a number of numerical experiments, considering Talbot-Lau x-ray phase contrast imaging as an application.
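The sequential structure described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the forward model `F`, its Jacobian `J`, and all parameter values are placeholders, and a simple Tikhonov-regularized Gauss-Newton step stands in for the Projected Newton solve of the quadratically constrained sub-problem; the discrepancy principle appears only as the stopping test on the residual norm.

```python
import numpy as np

def sequential_linearized_solve(F, J, b, sigma, x0, reg=1e-6, max_iters=50):
    """Outer loop sketch: repeatedly linearize the nonlinear forward model
    F at the current iterate and solve a regularized linearized sub-problem.
    Iteration stops once the discrepancy principle ||F(x) - b|| <= sigma
    is satisfied. `reg` weights a simple quadratic regularizer standing in
    for the convex regularization function of the paper."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(max_iters):
        r = F(x) - b
        if np.linalg.norm(r) <= sigma:      # discrepancy principle reached
            break
        Jk = J(x)
        # Linearized sub-problem (Tikhonov stand-in for the constrained
        # Projected Newton step):  min_d ||Jk d + r||^2 + reg ||x + d||^2
        A = Jk.T @ Jk + reg * np.eye(x.size)
        d = np.linalg.solve(A, -(Jk.T @ r + reg * x))
        x = x + d
    return x

# Toy usage: recover x from componentwise squaring, F(x) = x**2.
F = lambda x: x**2
J = lambda x: np.diag(2.0 * x)          # Jacobian of the toy model
b = np.array([1.0, 4.0])                # noise-free data for x = [1, 2]
x = sequential_linearized_solve(F, J, b, sigma=1e-3, x0=np.array([0.5, 1.5]))
```

With a small `reg`, each step is close to a Gauss-Newton step, so the toy problem converges to `[1, 2]` in a handful of iterations; the paper's method instead enforces the linearized discrepancy constraint exactly in every sub-problem.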