Inverse obstacle scattering problems occur in many real-world applications such as radar, sonar and non-destructive testing. One main objective of this kind of problem is to find the shape and location of an unknown obstacle, which usually cannot be observed directly. Owing to these wide applications, such problems have attracted the attention of many researchers. In particular, a large number of papers emphasize designing effective numerical algorithms to recover the unknown information from related observed data [1, 2].
The observed data are inevitably contaminated by noise. The Gaussian process has many theoretical and practical advantages since it can mimic the effect of many random processes, and therefore many noise models can be characterized as Gaussian. However, in many applications the response variable of interest is a count, that is, it takes on nonnegative integer values. Count data are well modeled by a Poisson process. The Poisson process is usually used to describe the occurrences of events and the time points at which the events occur in a given time interval, such as the occurrence of natural disasters and the arrival times of customers at a service center. In the obstacle scattering problem, the counts of photons at a remote closed surface are detected. As seen in [3, 9, 10], the counted photons have interacted with the unknown object and can be well modeled by a Poisson process. In view of this, we consider using Poisson data to recover the unknown shape.
The stochastic nature of the data undoubtedly results in uncertainty in the reconstruction. Therefore, besides the reconstruction of the unknown object itself, its uncertainty requires extra attention. Probabilistic thinking provides a natural way to quantify this uncertainty, and the Bayesian inference method has recently become a popular tool for this purpose.
In the Bayesian approach, the main idea is to translate the prior knowledge about the noise and the unknowns into prior probability laws, and then use the forward model to transfer the prior information to the posterior distribution, from which we can deduce any information about the unknowns. A rigorous Bayesian framework has been developed for inverse problems with infinite-dimensional unknowns.
From existing works, we can see that the prior for the unknowns is crucial in the Bayesian approach and may even determine whether or not it works. A difficult subject is how to encode empirical prior information in a distribution. It is well known that, subject to certain mild conditions, the sum of a set of random variables has a distribution that becomes increasingly Gaussian as the number of terms in the sum increases. This allows us to use Gaussians to model the prior for many real physical parameters. In this paper, we characterize the unknown by two kinds of prior knowledge: a Gaussian prior and a hybrid prior. Under these priors, we discuss the well-posedness of the posterior distribution with Poisson data.
The remainder of this paper is organized as follows. In Section 2, we describe the exterior acoustic scattering problem. In Section 3, the Bayesian approach and the well-posedness are discussed. Finally, some numerical examples are given.
2 Acoustic obstacle scattering with Poisson data
For a perfect cylindrical conductor with smooth cross section $D$, the scattering of polarized, transverse magnetic time-harmonic electromagnetic waves is described by the Helmholtz equation
\[
\Delta u + k^2 u = 0 \quad \text{in } \mathbb{R}^2 \setminus \overline{D}, \qquad u = u^i + u^s,
\]
where $k > 0$ is the wave number, $u$ the total field, $u^s$ the scattered field and $u^i$ the incident wave. We let $u^i(x) = e^{ik x \cdot d}$, where $d \in \mathbb{S}^1$ is the incident direction. The following boundary conditions are imposed:
\[
u = 0 \quad \text{on } \partial D, \tag{2.2}
\]
\[
\lim_{|x| \to \infty} \sqrt{|x|} \left( \frac{\partial u^s}{\partial |x|} - ik u^s \right) = 0. \tag{2.3}
\]
The condition (2.2) corresponds to the sound-soft boundary condition and (2.3) is the so-called Sommerfeld radiation condition. The Sommerfeld radiation condition characterizes the outgoing wave and yields the following asymptotic behaviour for the scattered wave $u^s$:
\[
u^s(x) = \frac{e^{ik|x|}}{\sqrt{|x|}} \left( u_\infty(\hat{x}) + O\big(|x|^{-1}\big) \right), \quad |x| \to \infty,
\]
where $\hat{x} = x/|x|$ and $u_\infty$ is the far field pattern. The direct scattering problem is to find the scattered field $u^s$ for a given incident field $u^i$ and obstacle $D$. It is known that there exists a unique solution if $\partial D$ is Lipschitz continuous.
We focus on the reconstruction of the unknown domain $D$ from observed data: the photon counts of the scattered electromagnetic field far away from the obstacle. Since at large distances the photon density is approximately proportional to $|u_\infty|^2$, the obstacle reconstruction problem can be described by the operator equation
\[
y = \lambda\, |u_\infty(\partial D)|^2, \tag{2.5}
\]
where $\lambda$ is a constant corresponding to the noise level. We suppose that the domain is starlike, i.e., the boundary is parameterized by
\[
\partial D = \big\{ r(\theta) (\cos\theta, \sin\theta) : \theta \in [0, 2\pi) \big\}.
\]
In this setting, the radial function $r$ becomes the unknown, and equation (2.5) is therefore equivalent to
\[
y = F(r),
\]
where $F$ denotes the resulting forward operator.
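As an illustrative sketch of this parameterization, a starlike boundary can be discretized as follows; the radial function and the grid size here are arbitrary choices for illustration, not those used in our experiments.

```python
import numpy as np

def starlike_boundary(r, n=128):
    """Discretize a starlike boundary x(theta) = r(theta) * (cos theta, sin theta).

    r : callable mapping angles in [0, 2*pi) to positive radii.
    Returns an (n, 2) array of boundary points.
    """
    theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    radii = r(theta)
    assert np.all(radii > 0), "a starlike radial function must be positive"
    return np.column_stack((radii * np.cos(theta), radii * np.sin(theta)))

# Example: a peanut-like shape, a classical test obstacle in the literature.
peanut = starlike_boundary(lambda t: np.sqrt(np.cos(t) ** 2 + 0.25 * np.sin(t) ** 2))
```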
The operator equation can be derived from single- and double-layer potential theory. Let $\partial D$ be of class $C^2$. Define the single layer potential operator
\[
(S\varphi)(x) = 2 \int_{\partial D} \Phi(x, y)\, \varphi(y)\, ds(y), \quad x \in \partial D,
\]
and the double layer potential operator
\[
(K\varphi)(x) = 2 \int_{\partial D} \frac{\partial \Phi(x, y)}{\partial \nu(y)}\, \varphi(y)\, ds(y), \quad x \in \partial D,
\]
where $\Phi(x, y) = \frac{i}{4} H_0^{(1)}(k |x - y|)$ is the fundamental solution, $H_0^{(1)}$ is the Hankel function of the first kind of order zero and $\nu$ is the unit outward normal. The operators $S$ and $K$ are bounded from $C(\partial D)$ into $C(\partial D)$. With the single and double layer potentials, the scattered field can be written as
\[
u^s(x) = \int_{\partial D} \left( \frac{\partial \Phi(x, y)}{\partial \nu(y)} - i\eta\, \Phi(x, y) \right) \varphi(y)\, ds(y), \quad x \in \mathbb{R}^2 \setminus \overline{D},
\]
where $\eta > 0$ is the real coupling parameter and $\varphi$ the unknown density function. Then the main goal for the direct scattering problem is to determine the unknown density function from the sound-soft boundary condition via
\[
\varphi + K\varphi - i\eta S\varphi = -2 u^i \quad \text{on } \partial D. \tag{2.11}
\]
By (2.11), the far field pattern can further be written as
\[
u_\infty(\hat{x}) = \frac{e^{-i\pi/4}}{\sqrt{8\pi k}} \int_{\partial D} \big( k\, \nu(y) \cdot \hat{x} + \eta \big)\, e^{-ik\, \hat{x} \cdot y}\, \varphi(y)\, ds(y).
\]
3 Bayesian inversion
We consider the observed data $y = (y_1, \dots, y_n)$ to be generated by a Poisson point process [3] whose intensity is determined by the forward map. The goal of statistical inversion is to explore the posterior distribution, i.e., the distribution of the unknown $r$ conditional on the value of $y$. Bayes’ formula shows that
\[
\pi(r \mid y) \propto \pi(y \mid r)\, \pi_0(r), \tag{3.1}
\]
where $\pi_0(r)$ is the prior distribution for the unknown and $\pi(y \mid r)$ the likelihood function. The Poisson noise yields the likelihood function
\[
\pi(y \mid r) \propto \prod_{j=1}^{n} F_j(r)^{\, y_j}\, e^{-F_j(r)},
\]
where $F_j(r)$ denotes the $j$-th component of the forward map.
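For a finite vector of detector counts, this Poisson likelihood can be evaluated as follows; this is only a sketch, with the `intensity` argument playing the role of the forward-map values at the detectors.

```python
import math

def poisson_loglik(y, intensity):
    """Log-likelihood of independent Poisson counts y_j with means intensity_j:

    log p(y | intensity) = sum_j ( y_j * log(intensity_j) - intensity_j - log(y_j!) ).
    """
    total = 0.0
    for yj, lam in zip(y, intensity):
        if lam <= 0:
            return -math.inf  # the Poisson mean must be strictly positive
        total += yj * math.log(lam) - lam - math.lgamma(yj + 1.0)
    return total
```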
The prior is specified before acquiring the data. Since $r$ denotes a radial length, an additional restriction must be imposed: the unknown must be positive. For this purpose, we consider the following two parameterizations [1, 10]:
\[
r(\theta) = \exp\big(q(\theta)\big), \tag{3.5}
\]
\[
r(\theta) = a + b \operatorname{erf}\big(q(\theta)\big), \tag{3.6}
\]
where $a$ and $b$ are constants with $a > b > 0$ and $\operatorname{erf}$ is the error function
\[
\operatorname{erf}(x) = \frac{2}{\sqrt{\pi}} \int_0^x e^{-t^2}\, dt.
\]
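The two positivity maps can be sketched numerically as follows, assuming an exponential map and an error-function map; the constants `a` and `b` are illustrative values, not those used in the experiments.

```python
import math

def radius_exp(q):
    """Exponential map r = exp(q): strictly positive for any real q."""
    return math.exp(q)

def radius_erf(q, a=1.0, b=0.5):
    """Error-function map r = a + b*erf(q): values lie in (a - b, a + b).

    Requires a > b > 0 so that the radius stays strictly positive.
    """
    assert a > b > 0
    return a + b * math.erf(q)

# Both maps keep the radius positive over a wide range of q.
assert all(radius_exp(q) > 0 for q in [-3, 0, 3])
assert all(0.5 < radius_erf(q) < 1.5 for q in [-3, 0, 3])
```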
These two parameterizations guarantee that $r$ is strictly positive. With them, we now infer the new unknown parameter $q$. We still use $\pi_0$ to denote the prior distribution for $q$ and, without any confusion, the forward operator is still denoted by $F$. The prior needs to be chosen carefully and it is not unique. We give different priors for (3.5) and (3.6).
For (3.5), the prior is given by the Gaussian
\[
q \sim N(0, \mathcal{C}),
\]
where the covariance operator $\mathcal{C}$ should be chosen carefully. A convenient prior to use is a negative power of the Laplacian,
\[
\mathcal{C} = (-\Delta)^{-s},
\]
where $-\Delta$ is equipped with a definition domain of periodic functions. Under this prior assumption, it can be verified that samples of $q$ are almost surely continuous for suitable $s$ [1, 7]. With this prior, the well-posedness of the posterior distribution for the Gaussian noise case has been discussed in [1].
For (3.6), the covariance function is chosen as the periodic kernel
\[
k(\theta, \theta') = \sigma^2 \exp\left( -\frac{2 \sin^2\!\big((\theta - \theta')/2\big)}{\ell^2} \right), \tag{3.10}
\]
where $\ell$ defines the characteristic length-scale. This covariance can be viewed as mapping the one-dimensional input variable $\theta$ to the two-dimensional $u(\theta) = (\cos\theta, \sin\theta)$. The distance between $u(\theta)$ and $u(\theta')$ is given by
\[
|u(\theta) - u(\theta')| = 2 \left| \sin \frac{\theta - \theta'}{2} \right|.
\]
The covariance kernel (3.10) can therefore be obtained by taking the squared exponential kernel in the $u$ space.
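A minimal sketch of drawing a sample from a Gaussian with this periodic covariance; the hyperparameters `sigma` and `ell` are hypothetical, and a small diagonal jitter is added for numerical stability of the Cholesky factorization.

```python
import numpy as np

def periodic_se_cov(theta, sigma=1.0, ell=0.5):
    """Periodic squared-exponential covariance matrix on angles theta:

    k(t, t') = sigma^2 * exp(-2 * sin^2((t - t') / 2) / ell^2),
    the SE kernel composed with the embedding t -> (cos t, sin t).
    """
    d = theta[:, None] - theta[None, :]
    return sigma ** 2 * np.exp(-2.0 * np.sin(d / 2.0) ** 2 / ell ** 2)

rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
K = periodic_se_cov(theta)
# Jitter for numerical positive definiteness, then draw one prior sample.
L = np.linalg.cholesky(K + 1e-8 * np.eye(len(theta)))
q = L @ rng.standard_normal(len(theta))
```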
The hybrid prior augments this Gaussian with a total variation penalty,
\[
\frac{d\mu_0}{d\mu_G}(q) \propto \exp\big(-\beta |q|_{TV}\big),
\]
where $\mu_G$ denotes the Gaussian measure with covariance (3.10), $|q|_{TV}$ is the TV seminorm,
\[
|q|_{TV} = \int_0^{2\pi} |q'(\theta)|\, d\theta,
\]
with a positive constant $\beta$. It follows immediately that the posterior distribution has the form
\[
\frac{d\mu^y}{d\mu_G}(q) \propto \exp\big(-\Phi(q; y) - \beta |q|_{TV}\big),
\]
where $\Phi(q; y)$ denotes the negative log-likelihood.
Next, we discuss the well-posedness of the posterior distribution with the hybrid prior. This well-posedness is characterized in the sense of the Hellinger metric, which is defined by
\[
d_{\mathrm{Hell}}(\mu, \mu')^2 = \frac{1}{2} \int \left( \sqrt{\frac{d\mu}{d\nu}} - \sqrt{\frac{d\mu'}{d\nu}} \right)^{2} d\nu,
\]
where $\nu$ is any reference measure with respect to which both $\mu$ and $\mu'$ are absolutely continuous.
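For intuition, the Hellinger metric can be illustrated on discrete probability vectors, where the reference measure is the counting measure; this toy computation is not part of the analysis.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability vectors:

    d(p, q)^2 = 0.5 * sum_i (sqrt(p_i) - sqrt(q_i))^2.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))
```

The distance is $0$ for identical distributions and attains its maximum value $1$ for distributions with disjoint supports.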
Here, we need the following Fernique theorem to show that certain functionals are bounded for Gaussian measures.
Theorem 3.1 (Fernique).
If $\mu$ is a Gaussian measure on a Banach space $(X, \|\cdot\|)$ with $\mu(X) = 1$, then there exists $\alpha > 0$ such that
\[
\int_X \exp\big(\alpha \|x\|^2\big)\, \mu(dx) < \infty.
\]
For fixed $y$, there exists $M > 0$ such that
\[
\Phi(q; y) \ge -M
\]
for all $q$.
For every $\rho > 0$, there exist constants $M_1$ and $M_2$ such that, for all $y$, and all $q_1, q_2$ with $\max(\|q_1\|, \|q_2\|) < \rho$,
\[
\Phi(q_1; y) \le M_1 \quad \text{and} \quad |\Phi(q_1; y) - \Phi(q_2; y)| \le M_2 \|q_1 - q_2\|.
\]
For convenience, denote
\[
\Phi(q; y) = \langle F(q), \mathbf{1} \rangle - \langle y, \log F(q) \rangle,
\]
where $\langle \cdot, \cdot \rangle$ is the inner product in $\mathbb{R}^n$ and $\mathbf{1} = (1, \dots, 1)$. It is obvious from (3.6) that
\[
a - b \le r \le a + b.
\]
Therefore, the boundedness of the forward operator implies that
\[
F^- \le F(q) \le F^+ \tag{3.16}
\]
for two constant vectors $F^-$ and $F^+$. It follows immediately that $\Phi(q; y) \ge -M$ for a positive constant $M$. Furthermore, we obtain the local Lipschitz continuity of $\Phi(\cdot\,; y)$ by the Cauchy–Schwarz inequality.
It should be noted that the lower bound in (3.16) may have zero components. In this case, we can make a shift of the forward operator by
\[
\tilde{F}(q) = F(q) + c,
\]
where $c$ is a positive constant vector, and then discuss the new problem of the form
\[
\tilde{y} = \tilde{F}(q).
\]
To avoid additional notation, we still use $F$ to denote the forward operator.
Assume that $\mu_G$ is the Gaussian measure with covariance function (3.10). Then the posterior measure $\mu^y$ is well-defined and, for $y, y'$ with $\max(|y|, |y'|) < \rho$, there exists $C = C(\rho) > 0$ such that
\[
d_{\mathrm{Hell}}(\mu^y, \mu^{y'}) \le C\, |y - y'|.
\]
First, it should be pointed out that the posterior measure $\mu^y$ is absolutely continuous with respect to the prior measure $\mu_0$. This guarantees the existence of the Radon–Nikodym derivative $d\mu^y / d\mu_0$. In fact, for any measurable subset $A$, $\mu_0(A) = 0$ implies $\mu^y(A) = 0$. Define the normalization constant
\[
Z(y) = \int \exp\big(-\Phi(q; y) - \beta |q|_{TV}\big)\, \mu_G(dq).
\]
It is obvious that $Z(y) < \infty$. By Lemma 3.3, the integrand is bounded away from zero on the unit ball. Since $\mu_G$ is a Gaussian measure, the unit ball has positive measure. This shows that $Z(y)$ is strictly positive. Together with the absolute continuity, this shows that the posterior distribution is well-defined. For the Hellinger metric, it holds that
\[
d_{\mathrm{Hell}}(\mu^y, \mu^{y'})^2 = \frac{1}{2} \int \left( Z(y)^{-1/2} e^{-\frac{1}{2}\left(\Phi(q; y) + \beta |q|_{TV}\right)} - Z(y')^{-1/2} e^{-\frac{1}{2}\left(\Phi(q; y') + \beta |q|_{TV}\right)} \right)^{2} \mu_G(dq).
\]
Since the normalization constants $Z(y)$ and $Z(y')$ are bounded, it suffices to estimate the integral part. According to Lemma 3.2 and the Fernique theorem, the integral is bounded by a constant multiple of $|y - y'|^2$. Since $Z(y)$ and $Z(y')$ are also bounded from below, it follows that
\[
d_{\mathrm{Hell}}(\mu^y, \mu^{y'}) \le C\, |y - y'|.
\]
4 Numerical test
In this section, we present some numerical examples to show the effectiveness of the Bayesian method for the inverse obstacle scattering problem with Poisson data. We use the MCMC method to generate samples that explore the posterior distribution in (3.1).
It is clear that the operator in (3.9) has an explicit eigen-system. With this, the prior samples for (3.5) can be generated by the truncated Karhunen–Loève expansion, and the corresponding derivatives can be computed accurately. For (3.6), the samples for $q$ with the covariance (3.10) are periodic. We use the fast Fourier transform (FFT) to carry out the numerical computation of the derivatives of the samples. The posterior samples for the two approaches are both generated by the Metropolis–Hastings algorithm.
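The FFT-based differentiation of the periodic samples can be sketched as follows; a uniform grid on $[0, 2\pi)$ is assumed.

```python
import numpy as np

def fft_derivative(q):
    """Spectral derivative of a periodic sample on a uniform grid over [0, 2*pi).

    Differentiation multiplies the k-th Fourier coefficient by i*k.
    """
    n = len(q)
    k = np.fft.fftfreq(n, d=1.0 / n)  # integer wavenumbers 0, 1, ..., -1
    return np.real(np.fft.ifft(1j * k * np.fft.fft(q)))

theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
dq = fft_derivative(np.sin(theta))  # should approximate cos(theta)
```

For band-limited periodic samples the spectral derivative is accurate to machine precision, which is why the FFT route is preferred over finite differences here.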
We list only the hybrid prior case in the following; for (3.5) and the corresponding prior, the process is similar.
Initialize $q^{(0)}$ and further generate $r^{(0)}$ via (3.6).
For iteration $j = 1, \dots, N$ do
Propose $q^{\ast}$ and further generate $r^{\ast}$ via (3.6).
Accept or reject the proposal according to the Metropolis–Hastings acceptance probability.
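The steps above can be sketched as a generic random-walk Metropolis–Hastings iteration; the log-posterior callable, step size and the toy target below are illustrative assumptions, not the exact proposal used in our experiments.

```python
import numpy as np

def metropolis_hastings(log_post, q0, n_iter=5000, step=0.1, seed=0):
    """Random-walk Metropolis-Hastings for an unnormalized log-posterior.

    log_post : callable returning log pi(q | y) up to an additive constant.
    Returns the array of samples with shape (n_iter, dim).
    """
    rng = np.random.default_rng(seed)
    q = np.asarray(q0, dtype=float)
    lp = log_post(q)
    samples = np.empty((n_iter, q.size))
    for j in range(n_iter):
        q_prop = q + step * rng.standard_normal(q.size)  # symmetric proposal
        lp_prop = log_post(q_prop)
        # Accept with probability min(1, pi(q*) / pi(q)).
        if np.log(rng.uniform()) < lp_prop - lp:
            q, lp = q_prop, lp_prop
        samples[j] = q
    return samples

# Toy usage: sample a one-dimensional standard normal target.
chain = metropolis_hastings(lambda q: -0.5 * float(q @ q), np.zeros(1),
                            n_iter=20000, step=1.0)
```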
Consider the following obstacles.
In Fig. 1, we plot the Poisson data. Using these data, we reconstruct the obstacles displayed in Figs. 2 and 3. The posterior sample means are used as the approximations. In Fig. 2, we also plot the confidence interval for the peanut shape.
-  T. Bui-Thanh and O. Ghattas, An Analysis of Infinite Dimensional Bayesian Inverse Shape Acoustic Scattering and Its Numerical Approximation. SIAM/ASA Journal on Uncertainty Quantification 2, no. 1, 203-222, 2014.
-  D. Colton and R. Kress, Inverse Acoustic and Electromagnetic Scattering Theory (Third Edition), Springer, New York, 2013.
-  T. Hohage and F. Werner, Inverse problems with Poisson data: statistical regularization theory, applications and algorithms, Inverse Problems 32 (2016) 093001 (56pp).
-  Z. Li, Z. Deng and J. Sun, Limited aperture inverse scattering problems using Bayesian approach and extended sampling method, in preparation.
-  C. Rasmussen and C. Williams, Gaussian Processes for Machine Learning, The MIT Press, Cambridge, Massachusetts, 2006.
-  A. Solin and S. Särkkä, Explicit link between periodic covariance functions and state space models, in Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), 2014.
-  A. Stuart, Uncertainty Quantification in Bayesian Inversion, https://homepages.warwick.ac.uk/~masdr/TALKS/stuartICM.pdf.
-  A. Stuart, Inverse problems: a Bayesian perspective, Acta Numerica, 19: 451-559, 2010.
-  F. Werner, Inverse problems with Poisson data: Tikhonov-type regularization and iteratively regularized Newton methods, Dissertation, Göttingen, 2011.
-  Q. Zhou, X. Zhang and J. Li, Bayesian inference and uncertainty quantification for image reconstruction with Poisson data.