Derivative Interpolating Subspace Frameworks for Nonlinear Eigenvalue Problems
We first consider the problem of approximating a few eigenvalues of a proper rational matrix-valued function closest to a prescribed target. The proper rational matrix-valued function is assumed to be given in the transfer-function form H(s) = C (sI - A)^-1 B, where the middle factor is large, while the number of rows of C and the number of columns of B are equal and small. We propose a subspace framework that performs two-sided projections on the state-space representation of H(·), as commonly employed in model reduction, giving rise to a reduced transfer function. At every iteration, the projection subspaces are expanded so that Hermite interpolation conditions hold at the eigenvalues of the reduced transfer function closest to the target, which in turn leads to a new reduced transfer function. We prove that, when a sequence of eigenvalues of the reduced transfer functions converges to an eigenvalue of the full problem, it does so at least at a quadratic rate. In the second part, we extend the proposed framework to locate the eigenvalues of a general square large-scale nonlinear meromorphic matrix-valued function T(·), exploiting a representation ℛ(s) = C(s) A(s)^-1 B(s) - D(s) defined in terms of the block components of T(·). Numerical experiments illustrate that the proposed framework is reliable in locating a few eigenvalues closest to the target point, and that, in terms of runtime, it is competitive with established methods for nonlinear eigenvalue problems.
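As a rough illustration of the first part, the Python sketch below (not the authors' implementation) performs one subspace-expansion step: it enriches the right and left projection subspaces with shifted linear solves at a shift sigma, which is the standard way to enforce Hermite interpolation of H at sigma under two-sided projection, and then reads off the eigenvalues of the reduced transfer function as the finite generalized eigenvalues of a small Rosenbrock-type pencil. It assumes A is a large sparse n x n matrix, B and C are dense with C having as many rows as B has columns, and SciPy is available; the helper names expand_and_solve and reduced_eigs are illustrative choices, not from the paper.

import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla
from scipy.linalg import eig, orth

def reduced_eigs(Er, Ar, Br, Cr):
    # Eigenvalues of the reduced transfer function Hr(s) = Cr (s Er - Ar)^-1 Br,
    # computed as the finite generalized eigenvalues of the Rosenbrock pencil
    # (assumes square reduced blocks, i.e. V and W have equal dimensions).
    r, m = Ar.shape[0], Br.shape[1]
    M = np.block([[Ar, Br], [Cr, np.zeros((m, m))]])
    N = np.block([[Er, np.zeros((r, m))],
                  [np.zeros((m, r)), np.zeros((m, m))]])
    lam = eig(M, N, right=False)
    return lam[np.isfinite(lam)]

def expand_and_solve(A, B, C, sigma, V=None, W=None):
    # Enrich V with span((sigma I - A)^-1 B) and W with span((sigma I - A)^-* C^*);
    # with these directions the two-sided projection Hermite-interpolates H at sigma.
    n = A.shape[0]
    S = sp.csc_matrix(A - sigma * sp.eye(n), dtype=complex)
    lu = spla.splu(S)
    X = lu.solve(B.astype(complex))                      # right directions (up to sign)
    Y = lu.solve(C.conj().T.astype(complex), trans='H')  # left directions (up to sign)
    V = orth(X if V is None else np.hstack([V, X]))
    W = orth(Y if W is None else np.hstack([W, Y]))
    Er, Ar = W.conj().T @ V, W.conj().T @ (A @ V)        # projected descriptor pair
    Br, Cr = W.conj().T @ B, C @ V
    lam = reduced_eigs(Er, Ar, Br, Cr)
    return lam[np.argmin(np.abs(lam - sigma))], V, W

A driver would call expand_and_solve repeatedly, feeding the returned eigenvalue estimate back in as the next shift until it stagnates; the framework in the paper refines this idea and establishes the quadratic convergence rate stated above.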