On extension of the data driven ROM inverse scattering framework to partially nonreciprocal arrays

12/17/2021
by Vladimir Druskin, et al.
Schlumberger

Data-driven reduced order models (ROMs) have recently emerged as a powerful tool for the solution of inverse scattering problems. The main drawback of this approach is that it has been limited to measurement arrays with reciprocally collocated transmitters and receivers, that is, to square symmetric matrix (data) transfer functions. To relax this limitation, we use our previous work [14], where the ROMs were combined with the Lippmann-Schwinger integral equation to produce a direct nonlinear inversion method. In this work we extend this approach to more general transfer functions, including those that are non-symmetric, e.g., obtained by adding only receivers or only sources. The ROM is constructed based on the symmetric subset of the data and is used to construct all internal solutions. The remaining receivers are then used directly in the Lippmann-Schwinger equation. We demonstrate the new approach on a number of 1D and 2D examples with non-reciprocal arrays, including a single-input/multiple-output (SIMO) inverse problem, where the data is given by just a single-row matrix transfer function.


1 Introduction

In this work we extend the reduced order model (ROM) approach, used previously for inverse impedance, scattering and diffusion problems [borcea2011resistor, borcea2014model, druskin2016direct, druskin2018nonlinear, borcea2017untangling, borcea2019robust, BoDrMaMoZa, borcea2020reduced, DrMoZa, Borcea2021ReducedOM], to nonsymmetric data sets. Here we consider the inverse diffusion problem given spectral data, although the technique for handling non-symmetric transfer functions can be applied to a much wider class of problems.

In the ROM framework for solving multidimensional inverse problems, the ROM is constructed from a symmetric data set, that is, from coinciding source and receiver pairs; the ROM is chosen precisely to match this data set, see [druskin2016direct, druskin2018nonlinear, borcea2017untangling, borcea2019robust, BoDrMaMoZa, Borcea2021ReducedOM, borcea2021reduced]. Then, the ROM is transformed to a sparse form (tridiagonal for single-input/single-output (SISO) problems, block tridiagonal for multiple-input/multiple-output (MIMO) problems) by Lanczos orthogonalization. The data-driven ROM, in this orthogonalized form, can be viewed as a discrete network whose entries depend approximately linearly on the unknown PDE coefficients [borcea2011resistor, druskin2016direct, borcea2014model, borcea2020reduced, Borcea2021ReducedOM]. That is, the main nonlinearity of the inverse problem is absorbed by the orthogonalization process. This process is related to the seminal works of Marchenko, Gelfand, Levitan and Krein on inverse spectral problems, and to the idea of spectrally matched second order staggered finite-difference grids first introduced in [druskin1999gaussian] and first used for direct inversion in [BoDr].
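To make the orthogonalization step concrete, here is a minimal sketch on a discrete stand-in: a small symmetric positive definite matrix plays the role of the operator, and Lanczos with the source vector produces a tridiagonal matrix whose reduced transfer function reproduces the full one. The matrix, vector and helper name are our own illustration, not code from the paper.

```python
import numpy as np

def lanczos(A, b, m):
    """Lanczos with full reorthogonalization: returns tridiagonal T = Q^T A Q
    and the orthonormal Krylov basis Q built from the starting vector b."""
    n = len(b)
    Q = np.zeros((n, m))
    alpha, beta = np.zeros(m), np.zeros(m - 1)
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ Q[:, j]
        alpha[j] = Q[:, j] @ w
        # subtract components along all previous vectors (reorthogonalization)
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j < m - 1:
            beta[j] = np.linalg.norm(w)
            Q[:, j + 1] = w / beta[j]
    return np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1), Q

rng = np.random.default_rng(0)
R = rng.standard_normal((6, 6))
A = R @ R.T + 6 * np.eye(6)      # SPD stand-in for the discretized operator
b = rng.standard_normal(6)
T, Q = lanczos(A, b, 6)          # full run, so the reduced model is exact

lam = 0.7
e1 = np.zeros(6); e1[0] = 1.0
f_full = b @ np.linalg.solve(A + lam * np.eye(6), b)
f_rom = (b @ b) * (e1 @ np.linalg.solve(T + lam * np.eye(6), e1))
```

With fewer than n steps, T gives a rational approximation of the transfer function; with the full n steps used here the match is exact up to roundoff.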

The data-driven ROM can be viewed as a projection operator, or Galerkin system [doi:10.1137/1.9781611974829.ch7]. A crucial property, which was first noticed in [druskin2016direct], is that the Galerkin basis corresponding to the unknown coefficient is very close to the one from the homogeneous problem. This led to the back projection algorithm [druskin2018nonlinear], which allowed for direct reconstructions in multiple dimensions. In [BoDrMaMoZa], it was found that thanks to this weak dependence of the basis functions on the unknown coefficients, the ROM can also be directly used to generate internal solutions from boundary data only.

In the work [DrMoZa], the data-generated internal solutions (corresponding to the unknown coefficient $p$) introduced in [BoDrMaMoZa] were used in the Lippmann-Schwinger integral equation framework. If we consider the data $F(\lambda)$ for the unknown coefficient $p$ and background data $F_0(\lambda)$ corresponding to a known background $p_0$, then the Lippmann-Schwinger integral equation with respect to the unknown $p$ can be written as

$$F_0(\lambda) - F(\lambda) = \langle (p - p_0)\, u(\lambda),\, u_0(\lambda) \rangle, \qquad (1)$$

where $\langle \cdot, \cdot \rangle$ is the continuous inner product on the PDE domain, and where $u$ and $u_0$ are the unknown and background internal solutions respectively. For the Lippmann-Schwinger-Lanczos integral equation (LSL IE), we use the data-generated internal solution $\tilde u$ in place of $u$:

$$F_0(\lambda) - F(\lambda) = \langle (p - p_0)\, \tilde u(\lambda),\, u_0(\lambda) \rangle. \qquad (2)$$

Since $\tilde u$ is precomputed directly from the data without knowing $p$, (2) becomes linear with respect to $p - p_0$.

The Lippmann-Schwinger-Lanczos approach directly resolves one of the main limitations of the above-mentioned works: that they apply only to symmetric data sets. This is the subject of this manuscript. Indeed, while the ROM must still be constructed from a symmetric subset of the data, the Lippmann-Schwinger-Lanczos equation (2) can be directly applied to any additional receivers, using the same internal solutions. Furthermore, additional sources (with receivers in the symmetric part) can also be added thanks to reciprocity. Adding this data increases the range of the Lippmann-Schwinger-Lanczos linear operator on the right-hand side of (2), thus improving the quality of the solution of the inverse problem.

The structure of our extended multi-input/multi-output (MIMO) measurement array can be summarized with the help of (3). The column and row numbers correspond respectively to the indices of the receivers (outputs) and transmitters (inputs). The conventional data-driven ROM requires collocated receiver and transmitter arrays (the symmetric data set), so the measurements in this case are given by a square matrix, that is, the left upper block of (3). We assume reciprocity, implying symmetric transfer functions, thus it is sufficient just to measure their upper triangular parts. The extended receivers are shown in the upper right block. A common case is when all added receivers measure responses from one of the transmitters, resulting in adding a column. However, added receivers can potentially just measure responses from certain transmitters, resulting in an arbitrary sparsity pattern in the upper right block. Likewise, one can add transmitters by adding elements to the lower left block. By reciprocity, they can be symmetrically reflected into the right upper one.

$$F(\lambda) \sim \left( \begin{array}{c|c} \begin{array}{c} \text{symmetric block:} \\ \text{collocated pairs} \end{array} & \begin{array}{c} \text{added} \\ \text{receivers} \end{array} \\ \hline \begin{array}{c} \text{added} \\ \text{transmitters} \end{array} & \end{array} \right) \qquad (3)$$
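The block structure of (3) can be illustrated with a toy transfer matrix: a symmetric block for the collocated pairs, of which only the upper triangle need be measured, extended by extra receiver columns. The sizes and random values below are our own placeholders, not measured data.

```python
import numpy as np

K, R_extra = 3, 2                  # collocated pairs, added receivers (toy sizes)
rng = np.random.default_rng(1)

# measure only the upper triangle of the symmetric block ...
F_upper = np.triu(rng.standard_normal((K, K)))
# ... and fill in the lower triangle by reciprocity
F_sym = F_upper + np.triu(F_upper, 1).T

# upper-right block: responses at the added receivers
F_ext = rng.standard_normal((K, R_extra))

# added transmitters would form a lower-left block; by reciprocity they
# can instead be reflected into further columns of the upper-right block
F = np.hstack([F_sym, F_ext])      # K x (K + R_extra) transfer matrix
```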

This paper is organized as follows. In Section 2 we describe the entire process in detail for a one dimensional single-input/multiple-output (SIMO) problem. We briefly describe the construction of the ROM from the single-input/single-output (SISO) part of the data, the Lanczos orthogonalization process and the generation of the internal solutions. We then show its use in the Lippmann-Schwinger equation for the full SIMO data. The generalization of this process to SIMO or nonsymmetric MIMO arrays in higher dimensions is described in Section 3. Section 4 contains numerical experiments.

2 One dimensional SIMO problem

We begin this work with a one dimensional problem to demonstrate the approach. In the first subsection we describe the problem setup: one source and two receivers, one of which is collocated with the source. We note that the transfer function for this problem has the SISO transfer function as a component, which we use to construct the ROM. We review this briefly in the second subsection, along with the tridiagonalization of the ROM and the generation of the internal solutions. In the last subsection we show how to use the full transfer function and the internal solutions in the Lippmann-Schwinger equation in order to solve the fully nonlinear SIMO inverse problem.

2.1 Description of the SIMO problem

We start by considering the following single-input/multiple-output (SIMO) inverse problem in one dimension:

$$-u'' + p\, u + \lambda u = g \quad \text{on } (0,1), \qquad u'(0) = u'(1) = 0, \qquad (4)$$

where $p \geq 0$ is the unknown coefficient. The source $g$ is assumed to be a compactly supported real distribution localized near the origin, roughly speaking, an approximation of $\delta(x)$ supported on a small interval near $x = 0$. We also use a similar distribution $g_r$ near the right boundary to represent the second receiver. Consider $\lambda \in \mathbb{C} \setminus (-\infty, 0]$, since for nonnegative $p$ the above resolvent is well defined for $\lambda$ off of the negative real axis. We note that the above formulation can be obtained via the Laplace transform of the one dimensional diffusion or wave problem. The SIMO transfer function is then

$$F_1(\lambda) = \langle u(\lambda), g \rangle, \qquad (5)$$
$$F_2(\lambda) = \langle u(\lambda), g_r \rangle, \qquad (6)$$

where throughout the paper we use $\langle \cdot, \cdot \rangle$ to denote the continuous Hermitian inner product, which in the one dimensional case is

$$\langle f, h \rangle = \int_0^1 f\, \bar h\, dx.$$

For simplicity of exposition we consider real spectral data points $0 < \lambda_1 < \cdots < \lambda_n$; that is, we consider the data

$$F(\lambda_j), \quad F'(\lambda_j), \qquad j = 1, \ldots, n. \qquad (7)$$

In this case all solutions are real and the conjugates are unnecessary. For complex data points, all of the following still holds; see [DrMoZa]. The SIMO inverse problem is then to determine $p$ in (4) from the data (7).
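For intuition, synthetic data of the form (7) can be generated with a simple second-order finite-difference discretization of (4); the grid size, coefficient profile and spectral points below are our own illustrative choices, not those used in the paper's experiments.

```python
import numpy as np

n = 200
h = 1.0 / n
x = (np.arange(n) + 0.5) * h
p = 1.0 * ((x > 0.3) & (x < 0.5))       # illustrative piecewise-constant coefficient

# -u'' with zero Neumann conditions (3-point stencil), plus the potential p
main = 2.0 * np.ones(n); main[0] = main[-1] = 1.0
A = (np.diag(main) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2 + np.diag(p)

g = np.zeros(n); g[0] = 1.0 / h         # delta-like source near x = 0
gr = np.zeros(n); gr[-1] = 1.0 / h      # receiver distribution near x = 1

lam = np.array([1.0, 2.0, 4.0, 8.0])    # real spectral points
F = np.array([[h * (g @ np.linalg.solve(A + l * np.eye(n), g)),
               h * (gr @ np.linalg.solve(A + l * np.eye(n), g))]
              for l in lam])            # rows: lambda_j; columns: two receivers
```

The first column is the SISO part used for the ROM construction; the second column is the extra receiver that enters only through the Lippmann-Schwinger equation.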

2.2 Construction of the data-driven ROM, orthogonalization and internal solutions

The first component of the transfer function (5) is a SISO transfer function,

is the symmetric portion of , and can be used to construct the ROM exactly as in previous works e.g. [DrMoZa]. We describe this now briefly. The exact solutions to (4) , , form a basis for the projection subspace

We define the data-driven ROM as the Galerkin system corresponding to

(8)

where are symmetric positive definite matrices with the stiffness matrix given by

and mass matrix given by

The right hand side

is a column vector with components

and the Galerkin solution for the system is determined by the vector depending on . Note that corresponds to a column vector of coefficients of the solution with respect to the above basis of exact solutions. The matrices and can be obtained directly from the data, without knowing the exact solutions, from the formulas

$$M_{ij} = \frac{f(\lambda_j) - f(\lambda_i)}{\lambda_i - \lambda_j}, \qquad M_{ii} = -f'(\lambda_i), \qquad (9)$$

and

$$S_{ij} = \frac{\lambda_i f(\lambda_i) - \lambda_j f(\lambda_j)}{\lambda_i - \lambda_j}, \qquad S_{ii} = \left.\big(\lambda f(\lambda)\big)'\right|_{\lambda = \lambda_i}. \qquad (10)$$
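That the Gram matrices are indeed data-driven can be checked numerically on a discrete stand-in: below, the mass and stiffness matrices assembled purely from $F(\lambda_j)$ and $F'(\lambda_j)$ via divided differences (our reading of the formulas (9)-(10)) agree with the true Gram matrices of the solution snapshots. The toy operator and spectral points are our own assumptions for the test.

```python
import numpy as np

rng = np.random.default_rng(2)
n, m = 40, 4
R = rng.standard_normal((n, n))
A = R @ R.T / n + np.eye(n)             # SPD stand-in for the PDE operator
g = rng.standard_normal(n)
lam = np.array([0.5, 1.0, 2.0, 4.0])

# snapshots u_j = (A + lambda_j I)^{-1} g and the data F, F'
U = np.stack([np.linalg.solve(A + l * np.eye(n), g) for l in lam], axis=1)
F = np.array([g @ U[:, j] for j in range(m)])
dF = np.array([-(U[:, j] @ U[:, j]) for j in range(m)])   # F'(l_j) = -<u_j, u_j>

Mass, Stiff = np.empty((m, m)), np.empty((m, m))
for i in range(m):
    for j in range(m):
        if i == j:
            Mass[i, j] = -dF[i]                      # <u_i, u_i>
            Stiff[i, j] = F[i] + lam[i] * dF[i]      # (lambda F)'(lambda_i)
        else:
            Mass[i, j] = (F[j] - F[i]) / (lam[i] - lam[j])
            Stiff[i, j] = (lam[i] * F[i] - lam[j] * F[j]) / (lam[i] - lam[j])
```

Both identities follow from pairing the resolvent equations at $\lambda_i$ and $\lambda_j$, so only the measured transfer function and its derivative are needed.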

Due to the matching conditions, the ROM transfer function corresponding to this Galerkin system,

$$f^{ROM}(\lambda) = b^T c(\lambda),$$

matches (the symmetric part of) the data exactly. This is well known; a proof of this for real $\lambda_j$ is in [BoDrMaMoZa], and for complex $\lambda_j$ in [DrMoZa]. Furthermore, for any $\lambda$, the solution $u(\lambda)$ to (4) is close to its Galerkin projection

$$u(\lambda) \approx \mathbf{u}\, c(\lambda),$$

where $\mathbf{u}$ represents the row vector of basis functions,

$$\mathbf{u} = (u_1, \ldots, u_n).$$

Next we perform a change of basis, that is, we orthogonalize by using the Lanczos algorithm. More precisely, we run $n$ steps of the $M$-symmetric Lanczos algorithm corresponding to the matrix $A = M^{-1}S$ and initial vector $M^{-1}b$. This yields a tridiagonal matrix $T$ and $M$-orthonormal Lanczos vectors $q_1, \ldots, q_n$, such that

$$A Q = Q T, \qquad Q^T M Q = I, \qquad (11)$$

where

$$Q = (q_1, \ldots, q_n), \qquad q_1 = M^{-1} b / \gamma,$$

and

$$\gamma = \sqrt{b^T M^{-1} b}.$$

The new basis is orthonormal in the continuous inner product and is given by the row vector

$$\mathbf{v} = \mathbf{u}\, Q.$$

The Galerkin solutions and transfer function can then be written in this new basis as

$$u(\lambda) \approx \gamma\, \mathbf{v}\, (T + \lambda I)^{-1} e_1, \qquad (12)$$
$$f^{ROM}(\lambda) = \gamma^2\, e_1^T (T + \lambda I)^{-1} e_1, \qquad (13)$$

where $e_1$ is the first coordinate column vector in $\mathbb{R}^n$.

Now, we use this orthogonalized ROM to produce internal solutions. Of course, we do not know $p$, so we don't know the original basis of exact solutions $\mathbf{u}$. What we do instead is to replace the unknown orthogonalized basis $\mathbf{v} = \mathbf{u} Q$ with the orthogonalized background basis $\mathbf{v}_0 = \mathbf{u}_0 Q_0$ corresponding to a background coefficient $p_0$. Here $\mathbf{u}_0$ is the row vector of background solutions

$$\mathbf{u}_0 = \big(u_0(\lambda_1), \ldots, u_0(\lambda_n)\big)$$

to (4) corresponding to $p_0$ and the same spectral points $\lambda_j$. A ROM for this background problem is computed in the same way, and $Q_0$ is computed from its Lanczos orthogonalization. That is, one can compute an approximation to the unknown internal solution using

$$\tilde u(\lambda) = \gamma\, \mathbf{u}_0 Q_0\, (T + \lambda I)^{-1} e_1, \qquad (14)$$

which is obtained from data only.

2.3 Nonlinear inverse problem

We now use the Lippmann-Schwinger formulation to solve the nonlinear inverse problem for all of the data, not just the symmetric part, which is the SISO transfer function in this case. From (5) and its background counterpart we obtain the Lippmann-Schwinger equation

$$F_0(\lambda) - F(\lambda) = \begin{pmatrix} \langle (p - p_0)\, u(\lambda),\, u_0(\lambda) \rangle \\ \langle (p - p_0)\, u(\lambda),\, w_0(\lambda) \rangle \end{pmatrix} \qquad (15)$$

for $\lambda = \lambda_1, \ldots, \lambda_n$, where $u_0$ is the solution to the background problem

$$-u_0'' + p_0\, u_0 + \lambda u_0 = g \quad \text{on } (0,1) \qquad (16)$$

with zero Neumann conditions, and $w_0$ is the background solution with the receiver distribution $g_r$ in place of $g$. Correspondingly, $F_0$ is the background transfer function

$$F_0(\lambda) = \begin{pmatrix} \langle u_0(\lambda), g \rangle \\ \langle u_0(\lambda), g_r \rangle \end{pmatrix}. \qquad (17)$$

For real $\lambda_j$ we then have the equations

$$F_{0,1}(\lambda_j) - F_1(\lambda_j) = \langle (p - p_0)\, u(\lambda_j),\, u_0(\lambda_j) \rangle, \qquad (18)$$
$$F_{0,2}(\lambda_j) - F_2(\lambda_j) = \langle (p - p_0)\, u(\lambda_j),\, w_0(\lambda_j) \rangle, \qquad (19)$$

for $j = 1, \ldots, n$. If we put the background solutions in the vector

$$\mathbf{w}_0(\lambda) = \big(u_0(\lambda),\, w_0(\lambda)\big)^T,$$

we can write (18) and (19), together with their derivative counterparts implied by the data (7), as

$$F_0(\lambda_j) - F(\lambda_j) = \langle (p - p_0)\, u(\lambda_j),\, \mathbf{w}_0(\lambda_j) \rangle, \qquad (20)$$
$$F_0'(\lambda_j) - F'(\lambda_j) = \partial_\lambda \langle (p - p_0)\, u(\lambda),\, \mathbf{w}_0(\lambda) \rangle \big|_{\lambda = \lambda_j}, \qquad (21)$$

for $j = 1, \ldots, n$. As usual, the internal solutions $u(\lambda_j)$ and their derivatives with respect to $\lambda$ are unknown, and depend on $p$, so the system (18)-(21) is nonlinear with respect to $p$. Now, just as in [DrMoZa], we replace $u$ in (18)-(21) with its approximation $\tilde u$, which we obtained via (14) from the ROM constructed from the symmetric part of the data. We can write the new system for $\delta p = p - p_0$ as

$$\langle \delta p\, \tilde u(\lambda_j),\, \mathbf{w}_0(\lambda_j) \rangle = \mathbf{d}_j, \qquad j = 1, \ldots, n, \qquad (22)$$

where

$$\mathbf{d}_j = F_0(\lambda_j) - F(\lambda_j),$$

and

$$\tilde u(\lambda_j)\, \mathbf{w}_0(\lambda_j) = \big(\tilde u(\lambda_j)\, u_0(\lambda_j),\ \tilde u(\lambda_j)\, w_0(\lambda_j)\big)^T$$

is a $2$-dimensional vector of functions on $(0,1)$. Recall that $\tilde u$ is computed directly from the symmetric part of the data without knowing $p$, by using (14), making the nonlinear system (22) linear. We continue to refer to (22) as a Lippmann-Schwinger-Lanczos system. Setup (22) corresponds to the full array in (3).
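Once the data-generated internal solution is substituted, the system becomes linear but typically ill-conditioned; a common remedy, used in our experiments via projection onto dominant singular directions, is a truncated SVD solve. The helper below is a generic sketch of that regularization, not code from the paper.

```python
import numpy as np

def tsvd_solve(A, b, k):
    """Least-squares solve of A x = b truncated to the k dominant singular
    triplets; small singular values (which amplify noise) are discarded."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ b) / s[:k])

# sanity check on a consistent full-rank system
rng = np.random.default_rng(4)
A = rng.standard_normal((30, 20))
x_true = rng.standard_normal(20)
x = tsvd_solve(A, A @ x_true, k=20)
```

In practice the truncation level k is chosen from the decay of the singular values: too large a k amplifies noise, too small a k over-smooths the reconstruction.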

3 Multidimensional MIMO problem

In this section we consider a more general multidimensional case where we may have any number of sources and receivers. The symmetric part of the transfer function consists of all of the data from coinciding source/receiver pairs, that is, it corresponds to the left upper block of (3). Again, the symmetric part will be used to build the ROM as in [DrMoZa], while the remaining data will be used later in the Lippmann-Schwinger formulation. We will assume that the nonsymmetric, or remaining, part of the transfer function consists of additional receivers, that is, they will be located in the right upper block of (3). As already mentioned in the introduction, this is without loss of generality, since any remaining sources can equivalently be viewed as receivers by reciprocity.

3.1 Description of the MIMO problem

We consider a formulation that can be obtained via the Laplace transform of the diffusion or wave problem, which can be written as the following boundary value problem on $\Omega$:

$$-\Delta u_i + p\, u_i + \lambda u_i = g_i \quad \text{in } \Omega, \qquad \frac{\partial u_i}{\partial \nu} = 0 \quad \text{on } \partial\Omega, \qquad (23)$$

where $g_1, \ldots, g_K$ are localized sources, e.g., boundary charge distributions, supported near or at an accessible part of $\partial\Omega$. Consider also a set of distributions representing our receivers, which will include all of the source distributions. That is, let

$$g_1, \ldots, g_R, \qquad R \geq K,$$

where the first $K$ of them coincide with the sources. Let

$$u_i(\lambda), \qquad i = 1, \ldots, K,$$

be our solution functions corresponding to the sources. Then the multiple-input/multiple-output (MIMO) transfer function is a matrix valued function of $\lambda$,

$$F(\lambda) = \big( \langle u_i(\lambda), g_j \rangle \big)_{i = 1, \ldots, K;\ j = 1, \ldots, R}, \qquad (24)$$

where $\langle \cdot, \cdot \rangle$ represents the continuous inner product, that is,

$$\langle f, h \rangle = \int_\Omega f\, \bar h\, dx;$$

$F$ is matrix valued with

$$F_{ij}(\lambda) = \langle u_i(\lambda), g_j \rangle.$$

We consider the inverse problem with data whose left $K \times K$ blocks are real symmetric matrices, that is,

$$F(\lambda_l), \qquad l = 1, \ldots, n,$$

and

$$F'(\lambda_l), \qquad l = 1, \ldots, n,$$

for real spectral points $\lambda_l$, while we note again that complex data can be handled in the same way [DrMoZa].
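Reciprocity of the collocated subset of (24) is easy to verify on a discrete analogue: with a symmetric operator, $F_{ij} = F_{ji}$ for any pair of distributions, which is exactly why only the upper triangular part of the symmetric block need be measured. The operator and source distributions below are our own toy stand-ins.

```python
import numpy as np

rng = np.random.default_rng(3)
n, K = 50, 4
R = rng.standard_normal((n, n))
A = R @ R.T / n + np.eye(n)         # symmetric stand-in for the PDE operator
G = rng.standard_normal((n, K))     # K collocated source/receiver distributions

lam = 1.5
U = np.linalg.solve(A + lam * np.eye(n), G)   # solutions for all K sources at once
F = G.T @ U                                   # F_ij = <u_i, g_j>
```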

3.2 Construction of the data-driven ROM, block Lanczos, and internal solutions

We now construct the ROM based on the symmetric part of the transfer function:

$$F^s(\lambda) = \big( F_{ij}(\lambda) \big)_{i, j = 1, \ldots, K},$$

and correspondingly define

$$\mathbf{u}(\lambda) = \big(u_1(\lambda), \ldots, u_K(\lambda)\big).$$

We consider the $nK$-dimensional projection subspace

$$\mathcal{U} = \mathrm{span}\{u_i(\lambda_j),\ i = 1, \ldots, K,\ j = 1, \ldots, n\},$$

and define the MIMO data-driven ROM as the system

$$(S + \lambda M)\, C(\lambda) = B, \qquad (25)$$

where $S, M \in \mathbb{R}^{nK \times nK}$ are symmetric positive definite matrices, $B$ is assembled from the symmetric part of the data,

$$B = \big( F^s(\lambda_1), \ldots, F^s(\lambda_n) \big)^T \in \mathbb{R}^{nK \times K},$$

and the system solution $C(\lambda)$ is a matrix valued function of $\lambda$, again corresponding to coefficients of the solution with respect to the above basis of exact solutions. Stiffness and mass matrices are block versions of those in (8), with $K \times K$ blocks given by

$$M_{jl} = \frac{F^s(\lambda_l) - F^s(\lambda_j)}{\lambda_j - \lambda_l}, \qquad M_{jj} = -\frac{dF^s}{d\lambda}(\lambda_j),$$

and

$$S_{jl} = \frac{\lambda_j F^s(\lambda_j) - \lambda_l F^s(\lambda_l)}{\lambda_j - \lambda_l}, \qquad S_{jj} = \frac{d\big(\lambda F^s\big)}{d\lambda}\bigg|_{\lambda = \lambda_j}.$$

Again, $S$ and $M$ are obtained by imposing the conditions that the ROM transfer function matches the data. Next, we perform block Lanczos tridiagonalization, with matrix

$$A = M^{-1} S$$

and initial block vector $M^{-1} B$. From this we obtain the block-tridiagonal matrix

$$T \in \mathbb{R}^{nK \times nK}$$

with $K \times K$ blocks, and block-vectors which are orthonormal with respect to the $M$ inner product, which again corresponds to continuous orthonormality. From this we obtain the block counterpart of (11),

$$A Q = Q T, \qquad Q^T M Q = I,$$

where

$$Q = (Q_1, \ldots, Q_n)$$

collects the Lanczos block-vectors. The Galerkin projection of the true solution, or state solution, can then be written in the new Lanczos basis as

$$\mathbf{u}(\lambda) \approx \mathbf{v}\, (T + \lambda I)^{-1} E_1\, \hat\gamma, \qquad (26)$$

where $E_1$ consists of the first $K$ columns of the $nK \times nK$ identity matrix and $\hat\gamma$ is the $K \times K$ block counterpart of $\gamma$ in (12). We express the orthogonalized basis as

$$\mathbf{v} = \big( \mathbf{u}(\lambda_1), \ldots, \mathbf{u}(\lambda_n) \big)\, Q,$$

where $\mathbf{u}(\lambda_j)$ are the corresponding vectors of solutions. Note that this basis depends on the unknown internal solutions, so as before we replace the basis $\mathbf{v}$ in (26) with its background counterpart $\mathbf{v}_0 = \mathbf{u}_0 Q_0$. This yields the data-generated vector of internal solutions

$$\tilde{\mathbf{u}}(\lambda) = \mathbf{v}_0\, (T + \lambda I)^{-1} E_1\, \hat\gamma. \qquad (27)$$

3.3 Nonlinear MIMO inverse problem

We now solve the fully nonlinear inverse problem using the entire non-symmetric data set. From (24) and the equation for the background solutions we obtain the Lippmann-Schwinger formulation

$$F_0(\lambda) - F(\lambda) = \big( \langle (p - p_0)\, u_i(\lambda),\, w_{0j}(\lambda) \rangle \big)_{i = 1, \ldots, K;\ j = 1, \ldots, R}. \qquad (28)$$

Here $w_{01}, \ldots, w_{0R}$ represent all background solutions, with $w_{0j} = u_{0j}$ for $j \leq K$. For the derivative data we will have

$$F_0'(\lambda) - F'(\lambda) = \partial_\lambda \big( \langle (p - p_0)\, u_i(\lambda),\, w_{0j}(\lambda) \rangle \big)_{i = 1, \ldots, K;\ j = 1, \ldots, R}. \qquad (29)$$

Now we replace $\mathbf{u}$ with its approximation $\tilde{\mathbf{u}}$ in (28) and (29). Again, precomputing $\tilde{\mathbf{u}}$ from the data only via (27) will yield the linear system for $\delta p = p - p_0$, which is given by the upper triangular part of (30) and (31):

$$F_0(\lambda_l) - F(\lambda_l) = \big( \langle \delta p\, \tilde u_i(\lambda_l),\, w_{0j}(\lambda_l) \rangle \big)_{i, j}, \qquad (30)$$
$$F_0'(\lambda_l) - F'(\lambda_l) = \partial_\lambda \big( \langle \delta p\, \tilde u_i(\lambda),\, w_{0j}(\lambda) \rangle \big)_{i, j} \Big|_{\lambda = \lambda_l}, \qquad (31)$$

for $l = 1, \ldots, n$. This setup corresponds to the full upper triangular part of (3). However, as already mentioned, removal of any element of the data matrix with column number larger than $K$ does not affect the computation of $\tilde{\mathbf{u}}$, so we can use any sparsity pattern in the right upper blocks of (30) and (31). Note that these patterns can also vary for different matching frequencies. This would result in the reduction of the range of the forward linear operator without changing the remaining columns.

4 Numerical experiments

4.1 One dimensional SIMO problem

We first demonstrate numerical results for a one dimensional reconstruction problem as a proof of concept. The setup is exactly as in Section 2, with the synthetic data computed using finite elements. The LSL system is solved using the background solutions as a basis for the unknown $\delta p$. We use positive spectral values. Note in Figure 1 that the SISO data, which is used to create the internal solutions, reconstructs the bar closer to the source quite well. Using the same internal solutions, we add data from the receiver on the right hand side, and we see that the second bar is captured. The Born approximation does not work well for this high contrast example.

Figure 1: True one dimensional medium (top left) and its reconstructions using SIMO data and the Born approximation (top right), Lippmann-Schwinger-Lanczos using SISO data only (bottom left), and Lippmann-Schwinger-Lanczos using SIMO data (bottom right)

4.2 Two dimensional examples

Here we show our numerical reconstructions of a 2D two-bump medium. We assume that the data is noiseless. The construction of the data-driven ROM may become unstable in the presence of noise; however, one can regularize it using the SVD-based algorithm of [borcea2019robust]. For simplicity, we omitted that part in this paper. We chose frequencies that provide enough sensitivity to recover the unknown scatterers, but such that the data is still non-redundant. In our first experiment, we considered the medium shown on the top left in Fig. 2. The data acquisition setup mimics one from surface geophysical exploration, that is, we consider sources and receivers on the top boundary of the domain. We use positive spectral values. The forward problem was discretized on a regular triangular grid using a finite-element method. Equation (30) was approximated using nodal quadrature. The obtained linear system is ill-conditioned, and we solved it by projecting onto its dominant eigenvectors. On the top right of Fig. 2 we plotted the reconstruction using the LSL IE for the square data matrix. Red crosses and green circles on the plots show source and receiver locations, respectively. Then, we expanded the data matrix to a rectangular one by adding 10 receivers. For this scenario, the reconstruction using the Born linearization, i.e., when the data-generated internal solutions in (30) and (31) are replaced by the background ones, is shown on the bottom left of Fig. 2. For the same data, the image produced by the extended LSL IE is plotted on the bottom right of Fig. 2. As one can observe, it improves the image compared to both Born and the square LSL IE. In fact, in Fig. 3 we plotted the solutions of (30) and (31) assuming the exact internal solution is available, which we call "Cheated IE". This represents the best possible result if one were to apply an iterative (also known as distorted) Lippmann-Schwinger algorithm which converged. (Iterative Lippmann-Schwinger would require multiple solutions of forward problems, which can be costly even if such an algorithm converges.) The cheated reconstructions are identical to the corresponding results for the LSL IE in Fig. 2, and, obviously, in both cases improvements over the square data are obtained just because of the additional receivers.

In the second experiment we considered a SIMO medical imaging setup with a single source and receivers located on all four sides of a square domain (see top left in Fig. 4) and compared it to the SISO scenario. We discretized the problem using finite elements on a uniform triangular grid. Equation (30) was approximated using quadrature on a regular grid. In this example we considered spectral values obtained by adding a negative one to the set from the previous example. In Fig. 4 we plotted the images obtained using the LSL IE for the SIMO (top right) and SISO (bottom right) scenarios as well as the Born reconstruction for SIMO (bottom left). In this case the SISO LSL image is qualitatively wrong. Both the SIMO LSL IE and Born images are qualitatively correct; however, the LSL IE image is quantitatively more accurate. To strengthen this argument we compare a slice (the red line in Fig. 4, top left) of the true coefficient with its Born and LSL IE SIMO images in Fig. 5.

Figure 2: Experiment 1: True medium (top left) and its reconstructions using the LSL IE for square data (top right), Born linearization for rectangular data (bottom left) and extended LSL IE for the same rectangular data (bottom right)
Figure 3: Experiment 1: LSL IE image with exact internal solution (Cheated IE) for square data (left) and for rectangular data (right)
Figure 4: True $p$ with its LSL SIMO, Born SIMO, and LSL SISO images. The SIMO reconstructions are qualitatively correct, while the SISO reconstruction is not. The LSL SIMO IE is a quantitatively better approximation to the true $p$ than Born.
Figure 5: Slices of the true $p$, the SIMO Born and the SIMO LSL IE reconstructions. The LSL IE reconstruction is clearly closer to the true model.

5 Conclusion and discussion

Using the previously developed Lippmann-Schwinger-Lanczos (LSL) algorithm, we successfully extend the model reduction approach to arrays with nonreciprocal subsets, including SIMO data sets. The main novelty of this algorithm is the exploitation of the product structure of the Lippmann-Schwinger equation. It allows us to plug in an approximate internal solution computed using only the symmetric subset of the data appropriate for the ROM construction, and then use all of the given data to solve the enlarged linear LSL system. Even though all of the derivations here are made for the second order Schrödinger equation in the Laplace domain, and assume exact data, we plan to extend them to first order systems and the time domain, as well as to the regularized noisy ROM formulations of [borcea2019robust, borcea2020reduced]. Future directions also include using iterative methods for very sparse data sets. We have already obtained promising preliminary results for the monostatic (SAR) problem, which is an important case of such data sets.

Acknowledgements. We are grateful to Liliana Borcea, Alex Mamonov and Jörn Zimmerling for productive discussions that inspired this research. V. Druskin was partially supported by AFOSR grant FA 955020-1-0079 and NSF grant DMS-2110773. S. Moskow was partially supported by NSF grants DMS-1715425 and DMS-2008441.

References