Variational Regularization Theory Based on Image Space Approximation Rates
We present a new approach to convergence rate results for variational regularization. Avoiding Bregman distances and using image space approximation rates as source conditions, we prove a nearly minimax theorem showing that the modulus of continuity is, up to a constant, an upper bound on the reconstruction error. Applied to Besov space regularization, we obtain convergence rate results for B^0_{2,q}- and B^0_{p,p}-penalties without restrictions on p, q ∈ (1,∞). Finally, we prove the equivalence of Hölder-type variational source conditions, bounds on the defect of the Tikhonov functional, and image space approximation rates.
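To fix ideas, the quantities referred to above can be sketched in standard notation; the forward operator F: X → Y, penalty R, regularization parameter α, noise level δ, source set K, and Hölder index μ are notational assumptions and are not fixed by the abstract itself. The Tikhonov estimator and the modulus of continuity read

\[
  \widehat{u}_{\alpha} \in \operatorname*{argmin}_{u \in X}
  \Big[ \tfrac{1}{2}\, \| F(u) - g^{\mathrm{obs}} \|_Y^2 + \alpha\, R(u) \Big],
  \qquad
  \omega(\delta, K) := \sup \big\{ \| u_1 - u_2 \|_X \;:\;
  u_1, u_2 \in K,\ \| F(u_1) - F(u_2) \|_Y \le \delta \big\},
\]

and one common form of a Hölder-type variational source condition, stated here with a norm on the left-hand side rather than a Bregman distance (in line with the abstract's avoidance of Bregman distances), is

\[
  \beta\, \| u - u^{\dagger} \|_X^{2} \;\le\; R(u) - R(u^{\dagger})
  + C\, \| F(u) - F(u^{\dagger}) \|_Y^{2\mu}
  \qquad \text{for all } u \in X, \quad \mu \in (0,1].
\]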