Increasing the relative smoothness of stochastically sampled data
We consider a linear ill-posed equation in the Hilbert space setting. Multiple independent, unbiased measurements of the right-hand side are available. A natural approach is to take the average of the measurements as an approximation of the right-hand side and to estimate the data error as the inverse of the square root of the number of measurements. We derive the optimal convergence rate (as the number of measurements tends to infinity) under classical source conditions and introduce a modified discrepancy principle that asymptotically attains this rate.
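The averaging-plus-discrepancy-principle idea above can be illustrated on a small finite-dimensional example. The sketch below is not the paper's method, only a minimal stand-in: a diagonal operator with rapidly decaying singular values plays the role of the ill-posed operator, the average of m noisy samples approximates the right-hand side, the data error is estimated as sigma/sqrt(m), and ordinary Tikhonov regularization with the classical discrepancy principle selects the regularization parameter. All names, sizes, and parameters (n, m, sigma, tau) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Finite-dimensional stand-in for the ill-posed problem K x = y
# (illustrative choice, not taken from the paper).
n = 50
K = np.diag(1.0 / np.arange(1, n + 1) ** 2)  # rapidly decaying singular values
x_true = np.ones(n)
y_true = K @ x_true

m = 400                       # number of independent unbiased measurements
sigma = 0.1                   # per-measurement noise level (assumed known here)
Y = y_true + sigma * rng.standard_normal((m, n))
y_bar = Y.mean(axis=0)        # average of the measurements
delta = sigma / np.sqrt(m)    # estimated data error ~ 1/sqrt(m)

# Classical discrepancy principle: shrink the Tikhonov parameter alpha until
# the residual ||K x_alpha - y_bar|| first drops below tau * delta * sqrt(n).
tau = 1.1
alpha = 1.0
while alpha > 1e-16:
    x_alpha = np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y_bar)
    if np.linalg.norm(K @ x_alpha - y_bar) <= tau * delta * np.sqrt(n):
        break
    alpha *= 0.5

print(f"alpha = {alpha:.2e}, reconstruction error = "
      f"{np.linalg.norm(x_alpha - x_true):.3f}")
```

Increasing m drives delta toward zero, so the stopping threshold tightens and the selected alpha shrinks, mirroring the asymptotic regime studied in the abstract.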