Providing enhanced visualization (e.g., of an organ's internal structures) during an intervention can significantly improve surgical procedures. Since each image modality provides different and often complementary information on tissue structures or deformation changes, the fusion of intra-operative and preoperative images into a unique coordinate frame adds significant value [14]. Registration of the preoperative image onto an intra-operative image (X-ray, ultrasound, cone beam CT, etc.) can be handled in many different manners, depending on the application, the image modality, the parameter space, or the optimization process. The literature on the topic is vast, and several surveys classify the different methods for this problem [22]. Regardless of the targeted clinical application, this paper focuses on vessel-based registration approaches, with special interest in the type of deformation that is handled, the way the global displacement field is reconstructed, and their compatibility with the clinical workflow.
Feature-based registration methods require an identification of correspondences between the pre-operative and intra-operative images. Being present in many anatomical structures, vessels can be used as features for multi-modal image registration. As intrinsic and natural features, they eliminate the need for markers, making this solution more compatible with clinical constraints. More generally, graph-like structures are common in medical images and can be obtained from angiography or airway trees, to mention only a few. Robustly registering such graphs is thus a key enabling technology for preoperative planning, intra-operative navigation, follow-up, or group-wise analysis [8]. However, the topology and shape of vascular graphs are not consistent between patients [4], and, when considering intra-patient registration, variations in image quality also lead to very different graphs to be matched, making this problem quite challenging.
Several methods use Euclidean or geodesic metrics between nodes to match graphs. Often, root nodes are manually matched [24], or a global rigid or affine alignment is first performed before applying fine graph-matching techniques [9]. These methods encode node and edge similarity in an affinity matrix. Graph matching is then formulated as a quadratic assignment problem which maximizes a global consistency score [6, 1, 25]. However, defining a node- or edge-similarity metric is not easy due to the non-linear deformation which may occur between the two acquisitions. In addition, due to segmentation inaccuracies, geodesic constraints may not be exactly satisfied, and topology changes may also appear. When graph nodes and edges do not have distinctive features and only the vessel geometry is available, methods based on local similarities are ineffective. The use of an over-connected graph has thus been proposed [9]. This approach adds to the graph the edges between nodes connected by vessels within a neighborhood of a given radius, which compensates for topological inaccuracies or deformation of branches. However, robustness to large deformations is not guaranteed, since nothing ensures that the established correspondences are physically coherent with the elasticity of the organ. In addition, the process depends on the initial rigid registration.
Within an intra-operative context, the transformation model needs to efficiently and accurately describe highly deformable anatomical properties [22]. Computational efficiency has been achieved with simple models such as thin-plate spline transformations [5]; however, this deformation is not necessarily physically coherent. Serradell et al. [20] proposed an approach which does not rely on local similarity but uses a deformation model to determine whether a new matching pair is compatible with the current set of matching hypotheses. Using Gaussian Process Regression (GPR), this method iteratively generates hypotheses while the search space is refined; the search space defines the most likely correspondences to explore at every step. On several examples and without initialization, this method handled topological differences and deformation. Nonetheless, the GPR mapping cannot handle large non-linear deformations and thus may only find incomplete correspondences. Moreover, the matching time for large graphs does not satisfy intra-operative constraints. To efficiently match large graphs, a Monte Carlo tree search method was proposed [16]. It explores new matches and extends good matches in a balanced way. Its efficiency relies on a path descriptor and an implicit transformation model that assume a rigid transformation with small non-linear deformation. Thus, this method is not suitable for large deformations.
Finally, in order to have a more discriminative deformation model without a prohibitive computational cost at each iteration, Pinheiro et al. [15] recently proposed to use a B-spline deformation model which is updated incrementally with a Kalman filter. While this process saves time, B-splines cannot properly describe the elastic properties of the organs. Generally speaking, the choice of a constitutive model is essential for the registration process when considering deformable organs. The models used in the previous methods [5, 20, 16, 15] achieve computational efficiency; however, physical consistency with organ properties, in particular heterogeneity and anisotropy of the tissues, is not guaranteed. To this end, more realistic, biomechanically-driven models have been proposed to solve elastic intensity-based registration problems [11]. However, the combination of graph matching and biomechanics is a new direction.
A biomechanical graph matching method (BGM), which combines a GPR approach with a biomechanical model of the organ as a means to discard matching hypotheses that would lead to non-plausible deformations, was proposed in [3]. However, simply replacing the GPR by a biomechanical model is extremely costly in terms of computation time, as the physics-based simulation takes about three orders of magnitude longer than GPR. BGM is therefore initialized with a GPR solution to reduce the matching time. This made it possible to recover additional matches compatible with the elasticity of the organ, even when large elastic deformations were considered. However, BGM was not robust to noise, only matched graphs of limited size, and its computation time was still above intra-operative constraints.
Several contributions with respect to BGM are presented in this article: i) the use of a more advanced biomechanical model to handle heterogeneity and anisotropy due to the vascularization; ii) the definition of a novel and better metric for generating improved graph-matching hypotheses, based on the notion of compliance, the inverse of the stiffness; and iii) the generation of matching hypotheses using an adaptive bounded region. The first two contributions compose the Vessels Compliance Graph Matching method (VCGM), which reduces the matching search space and/or improves the registration accuracy. The three contributions together constitute the Adaptive Compliance Graph Matching method (ACGM), which further reduces the computation time by first predicting the most plausible matching hypotheses, and reduces the sensitivity to the search-space parameters. These contributions improve the registration quality and meet intra-operative timing constraints.
This article describes the mechanical model used and both novel methods (VCGM and ACGM). It also presents experiments on both synthetic and real data, including a sensitivity analysis highlighting the robustness and genericity of the compliance matching methods.
2 Materials and Methods
Two porcine liver datasets (PA and PB) were acquired with contrast agent injection, under an institutionally approved animal ethics protocol. This article does not contain patient data. For both the pre- and intra-operative images, the vessels are segmented using a model-based tube detection filter combined with a region growing algorithm [21]. Then, from each segmentation, Dijkstra minimum-cost spanning trees are recursively constructed to obtain the vessel graph [17]. Because the fully automatic vessel segmentation is often inaccurate and incomplete, especially for the intra-operative images, the resulting graphs have important topological differences which must be correctly addressed by the matching algorithm.
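The spanning-tree step can be sketched as follows. The toy graph and its edge costs are hypothetical stand-ins for the output of the tube-detection filter, and Kruskal's algorithm is used here for brevity in place of the recursive Dijkstra construction of [17]; both keep one cheapest path between nodes and discard redundant loops introduced by segmentation noise.

```python
# Toy adjacency of 5 centerline points; weights are local traversal costs
# (hypothetical values -- in practice they derive from the tube-detection filter).
edges = [(1.0, 0, 1), (2.0, 1, 2), (1.5, 1, 3), (0.5, 3, 4), (4.0, 2, 4)]

# Kruskal's minimum spanning tree with union-find: keep the cheapest edge
# that connects two components, discard edges that would close a loop.
parent = list(range(5))

def find(i):
    # Path-halving union-find lookup of the component representative.
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

mst = []
for w, i, j in sorted(edges):
    ri, rj = find(i), find(j)
    if ri != rj:          # edge joins two separate components: keep it
        parent[ri] = rj
        mst.append((i, j))
```

The costly loop-closing edge (2, 4) is dropped, leaving a tree of 4 edges over the 5 points.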
The complete registration pipeline, shown in Fig. 1, consists of: (i) the improved graph matching algorithm (iGPR [3]), which significantly reduces the matching time; its result is usually incomplete, as it generally misses the parts undergoing large deformation, so it is only used as an initialization for the next step; (ii) ACGM, which reduces the search space by using the compliance while finding most of the bifurcation matches, including those with large deformations; and (iii) the fine alignment of the graph edges using the biomechanical model. The first and third parts were previously presented in Garcia-Guevara et al. [3], while ACGM comprises the contributions described in the following sections.
2.1 Fast biomechanical model of vascularized liver
The original GPR matching [20] is relatively generic and, depending on the values of its hyperparameters, may theoretically adapt to a large range of deformations. Nevertheless, the deformations which occur in soft tissues during surgical manipulations display a high level of non-linearity due to their complex characteristics. Typically, internal structures such as vessels introduce heterogeneity and anisotropy which cannot be properly taken into account without a biomechanical model. In this work, the purpose of the biomechanical model is to compute (i) the transformation of the pre-operative data given the actual hypothesis, and (ii) the compliance, which replaces the covariance in the original GPR matching algorithm.
The biomechanical model considered in this work is based on the co-rotational formulation of linear elasticity [10]. Although this approach relies on a linear stress-strain relationship, it provides a good approximation of large deformations including rotations. Further, a composite FE approach which accounts for the mechanics of both the parenchyma and the vessels [13, 18] is used to model the vascularized tissue. The parenchyma is modeled with linear P1 tetrahedral elements where, for each element $e$, the local $12\times12$ stiffness matrix is computed as

$$\mathbf{K}_e = \mathbf{R}_e\,\mathbf{B}_e^\top \mathbf{D}\,\mathbf{B}_e\,\mathbf{R}_e^\top \qquad (1)$$

where $\mathbf{B}_e$ and $\mathbf{D}$ are, respectively, the strain-displacement and stress-strain matrices, which remain constant during the simulation, and $\mathbf{R}_e$ is a block-diagonal matrix composed of the element rotation matrix, which depends on the actual displacement of the parenchyma mesh nodes, thus introducing a non-linearity into the formulation [10]. The vascular structures are modeled as trees composed of serially-linked Timoshenko beam elements, mimicking the biomechanics of hollow tubes parametrized by Young's modulus, diameter and wall thickness.
The beam mechanics includes both positional and rotational degrees of freedom (DoFs). Hence, each beam element is modeled with a $12\times12$ local stiffness matrix $\mathbf{K}_b$ which depends on the actual displacements and orientations of the beam nodes [19, 2].
Despite their identical size, the element matrices $\mathbf{K}_e$ and $\mathbf{K}_b$ have a completely different structure: the former describes the mechanics of an element given by 4 nodes, each determined by 3 positional DoFs, while the latter determines the behavior of an element having two nodes, each equipped with 3 positional and 3 rotational DoFs. The coupling mechanism [13] uses a mapping that defines a Jacobian matrix $\mathbf{J}$, which is used to compute a composite stiffness matrix as

$$\mathbf{K}_c = \mathbf{K}_e + \mathbf{J}^\top \mathbf{K}_b\,\mathbf{J} \qquad (2)$$
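A minimal sketch of this coupling: random symmetric positive-definite matrices stand in for the actual element stiffnesses, and a hypothetical dense mapping replaces the sparse Jacobian derived from barycentric coordinates; only the structure of the composite assembly is illustrated.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_spd(n):
    """Random symmetric positive-definite matrix standing in for a stiffness."""
    a = rng.standard_normal((n, n))
    return a @ a.T + n * np.eye(n)

# K_p: 12x12 parenchyma tetrahedron stiffness (4 nodes x 3 positional DoFs).
# K_b: 12x12 beam stiffness (2 nodes x 6 DoFs: 3 positional + 3 rotational).
K_p = random_spd(12)
K_b = random_spd(12)

# J maps the 12 beam DoFs onto the 12 positional DoFs of the embedding
# tetrahedron (hypothetical dense mapping; in practice it is sparse).
J = rng.standard_normal((12, 12))

# Composite stiffness: parenchyma contribution plus the beam stiffness
# pulled back through the mapping Jacobian.
K_c = K_p + J.T @ K_b @ J
```

The pull-back preserves symmetry and positive-definiteness, so the composite matrix can be assembled and factorized like any other element stiffness.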
Given the biomechanical FE model represented by a global stiffness matrix assembled from the composite local matrices, the transformation corresponding to a matching hypothesis $h$, given by pairs of bifurcations, is computed as follows. Since certain mesh nodes are generated to coincide with the set of source bifurcations, each bifurcation matching pair determines a non-homogeneous Dirichlet condition that drives the deformation model, i.e., it prescribes the displacement of the corresponding node. The Dirichlet conditions are imposed with a penalty method, which can be physically interpreted as adding a very stiff linear spring to each such node; the spring pulls the node from its initial source position towards the target position. Since the local stiffness matrices of the parenchyma and vascular elements depend on the actual displacement vector, the problem is non-linear and the final equilibrium must be computed iteratively. A damped Newton-Raphson method is used: in each iteration $i$, the update $\delta\mathbf{u}_i$ of the nodal positions is computed by solving the system of linear equations
$$(\mathbf{K}_h(\mathbf{u}_i) + \alpha\,\mathbf{I})\,\delta\mathbf{u}_i = \mathbf{b}(\mathbf{u}_i) \qquad (3)$$

where $\mathbf{K}_h$ is the global system matrix after imposition of the non-homogeneous Dirichlet boundary conditions corresponding to the hypothesis $h$, $\alpha$ is a damping parameter, $\mathbf{I}$ is the identity matrix of the same size as $\mathbf{K}_h$, and the right-hand-side vector $\mathbf{b}$ gathers the internal elastic forces and the prescribed displacements given by the hypothesis $h$. The displacement vector is updated in each step of the method as $\mathbf{u}_{i+1} = \mathbf{u}_i + \delta\mathbf{u}_i$, yielding the equilibrium displacement $\mathbf{u}_h$ obtained for hypothesis $h$ [12]. Since the aim is to minimize the time needed for the computation of the transformation, preconditioning is used: in the first iteration of the Newton-Raphson method, Eq. 3 is solved with an algorithm based on the sparse Cholesky decomposition (PARDISO, https://www.pardiso-project.org/). In the following iterations, preconditioned conjugate gradients are used to compute the update, employing the decomposition constructed in the first iteration.
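The damped iteration can be sketched on a toy nonlinear system; the cubic internal force is a stand-in for the displacement-dependent co-rotational forces, and a dense solve replaces the sparse Cholesky/conjugate-gradient scheme. All numbers are hypothetical.

```python
import numpy as np

# Toy nonlinear element: internal force g(u) = K0 u + c * u**3 (per DoF),
# standing in for the displacement-dependent co-rotational forces.
K0 = np.diag([4.0, 9.0])
c = 0.5
f_ext = np.array([2.0, 3.0])   # penalty-spring loads (hypothetical)
alpha = 0.1                    # damping parameter of the Newton update

u = np.zeros(2)
for it in range(50):
    g = K0 @ u + c * u**3                 # internal elastic forces at u_i
    residual = f_ext - g                  # right-hand side b(u_i)
    if np.linalg.norm(residual) < 1e-10:  # equilibrium reached
        break
    K_t = K0 + np.diag(3 * c * u**2)      # tangent stiffness at u_i
    # Damped Newton step: (K_t + alpha * I) du = residual
    du = np.linalg.solve(K_t + alpha * np.eye(2), residual)
    u = u + du                            # u_{i+1} = u_i + du_i
```

The damping term regularizes the tangent system far from equilibrium while barely perturbing the quadratic convergence near the solution.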
Besides the elastic transformation, the FE model is employed to obtain the compliance at the free source bifurcations, which is used in both the VCGM and ACGM algorithms. Mathematically, the compliance is defined as the inverse of the stiffness matrix. It can be evaluated at an arbitrary node of the FE mesh, where it is given by the $3\times3$ symmetric tensor extracted from the global compliance matrix. The eigenvectors of this compliance tensor define the principal axes of an ellipsoid, and the eigenvalues determine its scale along each axis. From the mechanical point of view, this ellipsoid characterizes the flexibility of the node, i.e., it is proportional to the volume over which the node can be displaced under a constant unit force applied to the node in an arbitrary direction. Each source bifurcation coincides with a mesh node; hence its compliance corresponds to the $3\times3$ block extracted from the global compliance matrix at the position indexed by this node. Since the proposed method is adapted to large deformations, the compliance must be computed from the stiffness matrix assembled at the equilibrium displacement vector $\mathbf{u}_h$. Therefore, the elastic transformation corresponding to hypothesis $h$ is computed before the compliance is obtained for each free source bifurcation.
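Extracting a nodal compliance tensor and its ellipsoid can be sketched as follows; a random symmetric positive-definite matrix stands in for the equilibrium global stiffness, and the full inverse is formed only for clarity (in practice one solves against the factorized stiffness instead).

```python
import numpy as np

# Global stiffness assembled at equilibrium (toy 9-DoF SPD matrix here).
rng = np.random.default_rng(1)
a = rng.standard_normal((9, 9))
K = a @ a.T + 9 * np.eye(9)

# Compliance of the node owning DoFs 3..5: the corresponding 3x3 block
# of the global compliance matrix K^-1.
C_full = np.linalg.inv(K)
C_n = C_full[3:6, 3:6]

# Eigen-decomposition: eigenvectors are the ellipsoid's principal axes,
# eigenvalues its scale along each axis (larger = more "flexible").
scales, axes = np.linalg.eigh(C_n)
```

Since the inverse of a symmetric positive-definite matrix is itself symmetric positive-definite, the extracted block always yields a valid ellipsoid.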
The compliance has been used in other applications to produce structures with desirable physical properties [7]. In the following matching algorithms, by contrast, the compliance defines an improved metric for the generation of hypotheses.
2.2 Vessels Compliance Graph Matching (VCGM)
BGM [3] uses the GPR covariance to define the hypotheses search space and is able to find correct bifurcation matches even when large non-linear deformations occur. However, the covariance produces large bounded regions, shown in Fig. 2.a, and long computation times. This makes the algorithm incompatible with intra-operative deployment.
To overcome this crucial limitation of BGM, the first improvement proposed is VCGM, described in Algorithm 1. This algorithm matches a source graph $G_s = (V_s, E_s)$ to a target graph $G_t = (V_t, E_t)$, where $V_s$ and $V_t$ are the sets of source and target nodes (tree bifurcations), and $E_s$ and $E_t$ are the sets of source and target edges.
The algorithm is initialized with an incomplete hypothesis, i.e., a set of matched bifurcations from the source and target graphs (line 1). In practice, this initialization does not include matches implying large deformations and is quickly obtained with iGPR. From this initialization, the algorithm generates a set of hypotheses recursively. The current hypothesis defines the sets of free (still unmatched) source and target bifurcations. Up to here, VCGM is similar to BGM.
VCGM's contribution starts by using the current hypothesis with the FE simulation to compute the transformation and the compliance at the free source bifurcations (line 3). These replace, respectively, the GPR mean and covariance, removing this dependency completely.
To find the next matching candidates (line 5), the Mahalanobis distance

$$d_M(s, t) = \sqrt{(t - \phi_h(s))^\top\,\mathbf{C}_s^{-1}\,(t - \phi_h(s))} \qquad (4)$$

computed with the compliance $\mathbf{C}_s$ is used, where $\phi_h(s)$ is the position of the free source bifurcation $s$ transformed under the current hypothesis and $t$ is a free target bifurcation. Here, $d_M^2$ is equivalent to the virtual work needed to add a candidate match. The biomechanical simulation is important because it transforms the source bifurcations close to the target ones, and the beam model correctly simulates the deformation along the graph edges. In addition, the compliance tensor filters out candidates that would require a high energy to match. In particular, the beam model introduces additional stiffness and anisotropy, which lead to a smaller compliance in thicker vessels. This reduces the bounded region of these rigid vessels while keeping a higher compliance, and thus larger bounded regions, on thin flexible vessels. These compliance-based bounded regions are shown in Fig. 2.b.
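A small numeric example of this work-like metric; the coordinates and the diagonal compliance tensor are hypothetical. Displacement along a stiff (low-compliance) axis is penalized far more than the same displacement along a flexible axis.

```python
import numpy as np

# Transformed source bifurcation and a candidate target bifurcation
# (hypothetical coordinates, in millimetres).
phi_s = np.array([10.0, 4.0, 2.0])
t = np.array([11.0, 5.0, 2.5])

# 3x3 compliance tensor at the source bifurcation: stiff along x (e.g. a
# thick vessel direction), compliant along y and z (hypothetical values).
C = np.diag([0.2, 1.0, 1.0])

# Squared Mahalanobis distance d^2 = (t - phi_s)^T C^-1 (t - phi_s):
# proportional to the virtual work needed to accept this match.
d = t - phi_s
work = d @ np.linalg.solve(C, d)

# Compare against a squared threshold (hypothetical value): the 1 mm
# offset along the stiff x axis alone contributes 5.0 to the work.
accept = work < 4.0
```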
For each free source bifurcation, the bounded-region candidates are the free target bifurcations within the Mahalanobis or the strict Euclidean distance threshold (line 18).
From here, the algorithm is again similar to BGM. In line 19, the geodesic distances in the target graph from the already established correspondences to a new candidate match are computed; the corresponding source geodesic distances are obtained similarly. The potential candidates are thus the bifurcations for which the target and source geodesic distances are similar. The matching candidates finally retained (line 21) are the free bifurcations with the lowest number of potential candidates.
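The geodesic consistency test can be sketched as follows; the match set, distance tables and tolerance are hypothetical, and the real algorithm computes the distances along the graph edges rather than looking them up.

```python
# Established matches: source bifurcation id -> target bifurcation id.
matches = {0: 0, 1: 1}

# Hypothetical geodesic distances (mm) along each graph, keyed by
# (matched bifurcation, candidate bifurcation).
geo_src = {(0, 5): 20.0, (1, 5): 12.0}
geo_tgt = {(0, 7): 22.0, (1, 7): 30.0, (0, 8): 21.0, (1, 8): 12.5}

def geodesically_consistent(s_new, t_new, tol=3.0):
    """Keep a candidate pair only if its geodesic distances to every
    already matched bifurcation agree between source and target."""
    return all(
        abs(geo_src[(s, s_new)] - geo_tgt[(t, t_new)]) <= tol
        for s, t in matches.items()
    )

# Candidate target 7 is rejected (12 mm vs 30 mm to match 1 -> 1),
# candidate 8 is kept.
candidates = [t for t in (7, 8) if geodesically_consistent(5, t)]
```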
The algorithm continues (line 5) with the computation of the current hypothesis's quality score [20]. Then, the hypotheses are generated in the nested loops. First, every free bifurcation in a random permutation of the matching candidates is explored. Then, every target candidate, together with its associated free source bifurcation from the random permutation, creates a new hypothesis. The new hypothesis is used to recursively call the matching method until no more matches are found. Finally, the hypothesis with the best quality score is selected among all those explored.
2.2.1 Setting the Mahalanobis compliance threshold
In BGM, the covariance Mahalanobis threshold specifies a level of confidence given by the GPR covariance. However, its optimal value remains highly dependent on each dataset (e.g., the level of deformation or noise). In VCGM, the threshold $M_c$ represents an upper bound on the work needed to match bifurcations. The optimal $M_c$ depends on the FE model, the deformation magnitude, the initialization, and the incremental hypotheses generation (depicted in Fig. 3).
To simplify the optimal setting of $M_c$, it is defined in Eq. 5 as a function of the magnitude $d_{max}$ of the maximum incremental displacement, where the compliance entering Eq. 5 is obtained from the FE simulation at the initialization stage by selecting the bifurcation whose compliance ellipsoid has the maximum volume, i.e., the most "flexible" bifurcation. Since the free bifurcations can move in any direction, the maximum incremental displacement is assumed isotropic with respect to the eigenvectors of this compliance tensor.
The optimal $M_c$ remains dependent on the incremental hypotheses generation and on the deformation magnitude. Still, it is assumed that $d_{max}$ is smaller than the deformation of the initial maximum match and bigger than the displacements of the new matches. In practice, setting $d_{max}$ to about 40% of the largest bifurcation's deformation allowed several experiments, covering a wide range of initial maximum displacements, to be matched successfully.
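One plausible instantiation of this rule (the exact form of Eq. 5 is not reproduced here, so both the formula and the numbers below are assumptions): an isotropic displacement of magnitude $d_{max}$ is guaranteed to fall inside the Mahalanobis bound if the threshold is set against the least compliant axis of the most flexible bifurcation.

```python
import numpy as np

d_max = 20.0   # assumed maximum incremental displacement magnitude (mm)

# Compliance eigenvalues of the most "flexible" bifurcation, i.e. the one
# whose compliance ellipsoid has the largest volume (hypothetical values).
lam = np.array([2.0, 4.0, 8.0])

# A displacement of magnitude d_max along eigenvector i has Mahalanobis
# distance d_max / sqrt(lam_i); bounding with the smallest eigenvalue
# covers an isotropic displacement of d_max in any direction.
M_c = d_max / np.sqrt(lam.min())
```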
Since this bound is larger than required in some recursive steps of the algorithm, using a constant $M_c$ unnecessarily increases the search space. ACGM alleviates this issue.
2.3 Matching based on adaptive Mahalanobis distance (ACGM)
In VCGM, only the bifurcations that require less than a given amount of work, bounded by the constant compliance Mahalanobis threshold $M_c$, are matched. Although the compliance tensor filters out some incorrect matches, this constant upper bound unnecessarily enlarges the search space, because the incremental matching (Fig. 3) does not need the constant upper bound at every recursive step.
Instead of setting a constant threshold, ACGM uses the range $[L_{th}, U_{th}]$. As presented in Algorithm 2, this rigid-to-soft approach starts by adding the bifurcation matches that require the least amount of work ($L_{th}$) and, when no more matches are found, instead of exploring other alternative hypotheses, the work bound is gradually increased. Hence, the matches that require more work are gradually added until the maximum allowed work ($U_{th}$) is reached (lines 6 to 9).
In most cases, the rigid-to-soft strategy finds an appropriate set of correspondences before exploring an alternative matching path. Thus, when the exploration of the first matching path is finished, the work bound is reduced to save time (line 16).
$U_{th}$ can be set higher than the optimal value (overestimated) to guarantee that it covers a wide range of deformations, scales, or incremental exploration dependencies. Thanks to the rigid-to-soft approach, an overestimated $U_{th}$ produces only a small increase in computation time.
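The rigid-to-soft scheduling can be sketched as follows; the candidate works, bounds and step size are hypothetical, and the real algorithm re-runs the FE simulation after each acceptance, updating the remaining works.

```python
# Candidate matches with the "work" each requires (hypothetical squared
# Mahalanobis values); correct matches tend to need less work than noise.
candidates = [("b1", 0.8), ("b2", 1.4), ("b3", 3.1), ("b4", 6.0), ("noise", 9.5)]

L_th, U_th, step = 1.0, 8.0, 1.0   # lower/upper work bounds (assumed)

matched, th = [], L_th
while th <= U_th:
    # Accept every still-free candidate under the current work bound...
    newly = [b for b, w in candidates if w <= th and b not in matched]
    matched += newly
    # ...and only when no more matches are found, relax the bound.
    if not newly:
        th += step
```

Cheap (rigid-compatible) matches are committed first, and the noisy candidate above the upper bound is never explored.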
When there are outlier matches that require less work to be matched than the correct matches, $U_{th}$ becomes a critical parameter. In this specific case, $U_{th}$ should be large enough to include the correct matches.
2.3.1 Setting the Mahalanobis compliance threshold range for the ACGM method
ACGM also uses Eq. 5 to define the range $[L_{th}, U_{th}]$ as a function of the maximum incremental displacement magnitudes. The initialization, the FE model and $d_{max}$ help to define a correct $L_{th}$; therefore, even in the specific case of outliers that are not filtered out by the geodesic constraint, $L_{th}$ is a less critical parameter. $U_{th}$ is similar to $M_c$: both indirectly depend on the deformation and the scale because of the incremental hypotheses generation. The advantage of $U_{th}$ is that it can be overestimated to cover a wide range of cases without a large increase in computation time. For example, if in Fig. 3 the bifurcation matched at step 9 were missing, the incremental matching would change and $U_{th}$ would need to reach 15.5 to match the last bifurcation.
3 Results

This section presents the case of augmenting intra-operative Cone Beam Computed Tomography (CBCT) images with preoperative computed tomography angiography (CTA) data. The expected benefit is an improved visualization of the patient's tumor(s), vascular system and other relevant internal structures. CBCT is an imaging modality that is more available in the operating room than CTA or MRI. However, the contrast-to-noise ratio in CBCT is about half that of CTA, and CBCT suffers from motion artifacts, beam hardening, partial volume and ring effects [23]. Thus, CBCT cannot image certain lesions nor the complete anatomy. These deficiencies are compensated when preoperative data augment the intra-operative image. The proposed fusion approach is based on the matching of pre- and intra-operative vascular trees according to the methods presented above, and is evaluated on both synthetic and real data. All results and computation times were obtained with a regular desktop computer (4 GHz eight-core, 16 GB RAM).
3.1 Experiments on synthetic data
To evaluate the methods, synthetic (target) graphs that resemble CBCT data were generated from a real vascular graph segmented from a CTA image. First, the original CTA graph was deformed using a realistic hyperelastic FE simulation of the effect of the pneumoperitoneum on the liver [3]. Note that the FE model used for matching is linear, which makes the computation faster, and uses a Young's modulus of 1.5 kPa, different from that of the ground-truth simulations. From this deformed CTA graph, leaf vessels were iteratively removed until only 60% remained, to mimic the partial graph usually segmented from CBCT images. Then, noisy bifurcations were added (50% or 80% of the bifurcations remaining from the previous step). An example of the generated synthetic data is shown in Fig. 4. The original CTA graph and the deformed, reduced, and noisy target graphs were used to evaluate and compare the matching methods presented in this article. The target registration error (TRE) is computed on the complete original graph, including the 40% of vessels that were removed (therefore the TRE cannot reach zero in these experiments).
3.1.1 Accuracy and search space size with increasing Mahalanobis thresholds
The first experiment evaluates the search space size and the accuracy as a function of the bounded region used. Matching was repeated 10 times per method with increasing Mahalanobis thresholds; here, $M_c$, $L_{th}$ and $U_{th}$ were set directly, without using Eq. 5. These matchings were only used to study the methods' sensitivity to the Mahalanobis threshold. The increasing thresholds were tested on one synthetic dataset (28 source and 16 target bifurcations) with 4 different initializations. One initialization has 6 matches while the other initializations have 5 matches, which is why it explores fewer hypotheses. The three evaluated methods use the same realistic Euclidean and geodesic thresholds.
For every method, as shown in Fig. 5, the TRE remains high when a small Mahalanobis threshold is used, but decreases as the threshold increases. However, if a high threshold is used, the number of explored hypotheses and the search time increase without a significant improvement of the TRE. Nevertheless, the growth of BGM's search space is steeper than that of VCGM or ACGM. ACGM prunes the search space best and achieves the best TRE results, with the least dependency on the Mahalanobis threshold used.
3.1.2 Accuracy and search space size at the optimal Mahalanobis threshold
Using the previous experiment, every method is also compared at its optimal performance, i.e., at the smallest Mahalanobis threshold for which the TRE reaches its minimum value. The mean and standard deviation over the 4 different initializations at the optimal threshold are plotted in Fig. 6. Similar experiments with added noisy bifurcations (50% and 80%) are also plotted. Without noise, the VCGM and ACGM methods are faster than BGM. When noise is added, the exploration space increases considerably with BGM: this case requires up to 40 minutes for the matching and sometimes fails to find the correct match, increasing the mean TRE up to 5.8 mm. In contrast, the compliance methods require, on average, less than 12 minutes. ACGM is the fastest with less than 8 minutes on average, and a mean TRE of 4.8 mm.
3.1.3 Synthetic deformations using the same matching parameters
Ten different deformations were generated using different pressure, Young’s modulus and gravity orientation (to simulate subject in supine or flank position). Also, a craniocaudal force simulated different respiratory phases. These deformations are summarized in Table 1 with statistics of the bifurcations’ displacements. As in the previous experiments, the graphs have 40% branch removal and two levels of noisy bifurcations added.
Each method uses constant parameters to match all the different synthetic deformations.
With BGM, the optimal threshold depends on the deformation magnitude and the noise, and is therefore potentially difficult to set in clinical scenarios where no information about the deformation is known a priori. For this reason, the optimal threshold found in the previous experiments is used. This threshold, according to a cumulative chi-squared distribution, represents a 97% confidence region.
For the VCGM and ACGM methods, the parameter selection is simplified, since Eq. 5 takes into account the FE model and the initialization. Thus, the only parameters to set are the maximum incremental displacement magnitudes. Given that the compliance methods are not too sensitive to these parameters, approximate values are sufficient. VCGM uses a single value to define $M_c$, whereas ACGM uses two values to define the range $[L_{th}, U_{th}]$.
Fig. 7 presents the matching time and the TRE, measured against the synthetic ground truth, for each experiment. ACGM has the same or better TRE (better for deformations D2, D7 and D8) and is faster than the other methods.
Table 2 summarizes the results of the ten deformations at each level of noise. Without noise, the average matching time of ACGM is 1.4 minutes, while it is more than 7 minutes for the other methods. Since BGM fails to find all the correct matches, its average TRE is 0.8 mm worse than that of the other methods. With 50% added noise, the average matching time of ACGM only increases to two minutes, while it reaches 16 minutes for the other methods. With 80% added noise, the trend is clear: the average matching time of ACGM only increases to 2.7 minutes, while it reaches 23 minutes with BGM. The noise strongly affects BGM, whose TRE increases to 4.5 mm because it did not complete all the experiments, some of which exhausted the available RAM. ACGM maintains the best TRE even with noise.
The maximum incremental displacement range simplifies the correct setting of the compliance Mahalanobis thresholds. It allowed constant parameters to be used to correctly match a diverse set of synthetic deformations, making ACGM very robust and efficient.
3.2 Experiments on real data
Each of the two porcine liver datasets (PA and PB) comprises one CTA image acquired preoperatively in supine position and one CBCT image acquired after pneumoperitoneum in flank position. These intra-operative conditions generated a large deformation. The portal vein is visible in both modalities; however, the CBCT image has fewer visible portal vessel branches than the CTA, and several false branches were segmented due to noise (see Fig. 1 and the image intensity statistics in Table 3). From the automatic portal vein segmentations, the source (CTA) and target (CBCT) graphs are extracted [3].
The portal vein graphs are matched to register the CTA data onto the CBCT and thereby augment it. For instance, the hepatic vein is not visible in the CBCT, and fusing it from the CTA is clinically useful. Although the number of bifurcations (Table 3) is similar in both modalities, CBCT noise produces several false bifurcations (most of the non-matched yellow cubes in the second row of Fig. 8).
The matching is evaluated with clear and unambiguous manually selected landmarks visible in both modalities. These landmarks include inserted tumors, distinctive vessel points and bifurcations (3, 15 and 19 for PA, while 6, 5 and 11 for PB, respectively). The registration error (RE) is computed using all these evaluation landmarks, while the matching only uses the bifurcations. Therefore, the RE cannot reach zero in these experiments.
Table 4 presents the registration results for each dataset. First, using the matches obtained with the iGPR initialization, the rigid RE quantifies the magnitude of the non-linear deformation; the first row of Fig. 8 depicts this deformation. From the iGPR initialization, matching was repeated 10 times per method with increasing Mahalanobis thresholds to find the optimum. These repeated matchings are not needed during a normal registration and are only used here to provide a meaningful comparison. On average over the real datasets, ACGM has a 2.34 mm and 0.58 mm better RE than BGM and VCGM, respectively. Moreover, ACGM is 43.4 and 18.05 minutes faster than BGM and VCGM, respectively. ACGM is thus about 12 times faster than BGM and significantly improves the RE; while the RE difference between the compliance methods is small, ACGM is much faster than VCGM.
3.2.1 Influence of the segmentation of the vessels
Different pre-processing parameters were used to obtain four sets of graphs from a partial side of the PA dataset. Since only a partial side was used, the RE was evaluated on 28 landmarks only (two inserted tumors, 13 distinctive points and 13 bifurcations). Using these graph sets, the influence of the pre-processing steps on the matching methods is studied.
As before, the optimal Mahalanobis threshold per method was searched among ten matchings with increasing thresholds. The second part of Table 5 shows the optimal result of each method. In the majority of experiments, ACGM was faster while maintaining or even improving the RE. Only in experiment C (4th column of Table 5) was the ACGM matching time about the same as for BGM; however, there the ACGM maximum RE is 3.4 mm better than for BGM.
The third part of Table 5 shows the results obtained with the highest Mahalanobis thresholds used. From the optimal threshold to the highest one, the BGM matching time increases by an average of 28.3 minutes over the four segmentations, while the ACGM average matching time increases by only 6.6 minutes.
The fourth part of Table 5 shows the results obtained with the automatic Mahalanobis threshold selection. VCGM needs a higher matching time than ACGM, to the point that VCGM did not complete experiments C and D (marked with '-') because they exhausted the available RAM. ACGM with the automatic threshold selection is faster than with the optimal threshold: on average over the four segmentations it is 14.6 minutes faster, while its average RE is only 0.2 mm larger than with the optimal Mahalanobis compliance threshold.
3.2.2 Influence of geodesic distance accuracy
Using the graphs from the previous experiments A and B, ten matchings are performed to evaluate the influence of the geodesic distance. Fig. 9 shows the results using increasing geodesic thresholds and a constant Mahalanobis threshold. When the geodesic threshold is increased, the VCGM and ACGM methods do not increase the search time as much as BGM.
Due to image noise and pre-processing, the geodesic distance is inaccurate in real data. These inaccuracies are a problem for BGM, because a small geodesic threshold does not allow bifurcations with inaccurate geodesic distances to be matched (e.g. below 25% in Fig. 9.b). An important advantage of the compliance methods is that their dependence on the geodesic constraint is reduced. This allows a larger geodesic threshold to be used with ACGM, which in experiment B finds an extra correct match and reduces the RE to 2.9 mm.
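The geodesic constraint discussed above can be pictured as a relative-tolerance test: a pair of source bifurcations may only correspond to a pair of target bifurcations if their along-vessel (geodesic) distances agree within a tolerance. A minimal sketch, assuming a relative-error convention (the function name and convention are illustrative):

```python
def geodesic_compatible(g_src, g_tgt, eps):
    """Accept a candidate correspondence only if the geodesic distance
    between the two source bifurcations and the one between the two target
    bifurcations agree within a relative tolerance eps (e.g. 0.25 for 25%)."""
    if g_src <= 0:
        return False
    return abs(g_src - g_tgt) / g_src <= eps

# With a tight tolerance, a noisy 20% discrepancy is rejected; loosening
# the tolerance, as the reduced geodesic dependence of the compliance
# methods permits, recovers the candidate.
print(geodesic_compatible(40.0, 48.0, 0.10))  # -> False (20% off)
print(geodesic_compatible(40.0, 48.0, 0.25))  # -> True
```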
4 Discussion
In the 50% noise case of Section 3.1.2, ACGM is more sensitive to noise than VCGM because the threshold was set without using Eq. 5. Setting a correct threshold is hard in the presence of noise that cannot be filtered out by the geodesic constraint. This specific situation is a limitation of ACGM that was overcome in all other experiments, where Eq. 5 defined the threshold.
The experiment in Section 3.2 shows on real data that ACGM is faster and more accurate than BGM. Globally, ACGM correctly registers the two trees, except for a few leaf vessels in the upper middle part of Fig. 8c. These few vessels are not well transformed because the corresponding bifurcations are missing in the target segmentation. These errors can be addressed after the coarse bifurcation matching presented here, either by improving the fine matching of highly deformed edges or by using vessel end points.
It is hard to directly set the optimal Mahalanobis compliance threshold. In VCGM, the compliance threshold removes the dependency on the initialization and the FE model; however, it still depends on the deformation magnitude, the incremental hypothesis generation and the scale. A large threshold finds a correct match but increases the matching time. ACGM allows the threshold to be overestimated to guarantee the correct solution, and thanks to the rigid-to-soft approach this increases the matching time only minimally. This made it possible to set an approximate threshold range that matched a wide variety of experiments efficiently, which is why ACGM is considered better than VCGM.
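The Mahalanobis compliance test behind these thresholds can be sketched as a gate on the distance between a model-predicted and an observed bifurcation position under an uncertainty covariance; a hypothesis is accepted whenever the distance is below the threshold, which is why an overestimated threshold still retains the correct match, only at the cost of more hypotheses. A minimal sketch with numpy (the covariance and positions are illustrative):

```python
import numpy as np

def mahalanobis_gate(observed, predicted, cov, threshold):
    """Accept a match hypothesis if the Mahalanobis distance between the
    observed target bifurcation and the predicted position is below the
    compliance threshold. Returns (accepted, distance)."""
    d = np.asarray(observed, float) - np.asarray(predicted, float)
    m = float(np.sqrt(d @ np.linalg.inv(cov) @ d))
    return m <= threshold, m

# Illustrative 3-D example with isotropic 2 mm standard-deviation uncertainty:
# a 3 mm offset is 1.5 standard deviations away, below a threshold of 2.
cov = 4.0 * np.eye(3)
ok, m = mahalanobis_gate([3.0, 0.0, 0.0], [0.0, 0.0, 0.0], cov, 2.0)
print(ok, m)  # -> True 1.5
```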
The proposed ACGM method is up to one order of magnitude faster than BGM. Besides being faster, ACGM maintains or even improves the RE in both real and synthetic experiments. All the parameters of the algorithm, including the incremental maximum displacement range, are easy to set and work correctly in a wide range of scenarios. The method makes it possible to augment anatomical structures that are not visible in the intra-operative image. This improves the visualization of the target and at-risk anatomy, an important clinical need that could, for example, allow surgeons to reach small structures during image-guided procedures.
ACGM was successful even under very large deformations while using an automatic segmentation method on the porcine liver dataset. This robustness was also demonstrated by adding several levels of noise and deformation to the synthetic datasets. The efficient ACGM method handles a large number of noise-induced bifurcations while keeping the matching time within acceptable intra-operative constraints. This alleviates the need for low-noise intra-operative automatic image segmentation. The compliance filtering reduces the dependence on geodesic constraints, which could even allow unconnected graphs to be matched efficiently.
These results are promising, but further evaluation is required to validate in-vivo applicability. First, the impact of different image acquisition conditions (resolution, contrast injection or artifacts) should be studied. Second, experiments on larger human datasets are required to prove the clinical relevance. Although the clinical set-up is relatively common, collecting a large dataset with expert manual ground truth (to quantify the registration error) requires a significant amount of work.
The experiments show that ACGM comes close to accurately matching unconnected, very noisy and highly deformed graphs. Thus, a faster implementation of ACGM (with more efficient mechanical models, artificial-intelligence matching methods, and/or parallel implementations) could enable future US-CT registration studies.
5 Acknowledgements
The authors are grateful for the support from Inria, the MIMESIS and MAGRIT teams, and IHU Strasbourg. Jaime Garcia Guevara is supported by the Grand Est region and Inria.
6 Conflict of Interest
We declare that this article is free from conflicts of interest.
References
- 1 Cho, M., J. Lee, and K. M. Lee. Reweighted random walks for graph matching. In: Proceedings of the 11th European Conference on Computer Vision: Part V, ECCV'10, pp. 492–505, Berlin, Heidelberg: Springer-Verlag, 2010.
- 2 Duriez, C., S. Cotin, J. Lenoir, and P. Neumann. New approaches to catheter navigation for interventional radiology simulation. Computer Aided Surgery 11:300–308, 2006.
- 3 Garcia Guevara, J., I. Peterlik, M.-O. Berger, and S. Cotin. Biomechanics-based graph matching for augmented CT-CBCT. IJCARS, 2018.
- 4 Groher, M., T. F. Jakobs, N. Padoy, and N. Navab. Planning and intraoperative visualization of liver catheterizations: New CTA protocol and 2D-3D registration method. Academic Radiology 14:1325–1340, 2007.
- 5 Lange, T., N. Papenberg, S. Heldmann, J. Modersitzki, B. Fischer, H. Lamecker, and P. M. Schlag. 3D ultrasound-CT registration of the liver using combined landmark-intensity information. Int J Comput Assist Radiol Surg. 4:79–88, 2009.
- 6 Leordeanu, M. and M. Hebert. A spectral technique for correspondence problems using pairwise constraints. In: International Conference of Computer Vision (ICCV), volume 2, pp. 1482 – 1489. 2005.
- 7 Martínez, J., J. Dumas, S. Lefebvre, and L.-Y. Wei. Structure and appearance optimization for controllable shape design. ACM Trans. Graph. 34:229:1–229:11, 2015.
- 8 Matl, S., R. Brosig, M. Baust, N. Navab, and S. Demirci. Vascular image registration techniques: A living review. Medical Image Analysis 35:1–17, 2017.
- 9 Moriconi, S., M. Zuluaga, H. Rolf Jäger, P. Nachev, S. Ourselin, and M. Jorge Cardoso. Elastic registration of geodesic vascular graphs. In: MICCAI 2018, pp. 810–818. 2018.
- 10 Nesme, M., Y. Payan, and F. Faure. Efficient, physically plausible finite elements. In: Eurographics 2005, Short papers, August, 2005, edited by J. Dingliana and F. Ganovelli. 2005.
- 11 Oktay, O., L. Zhang, T. Mansi, P. Mountney, P. Mewes, S. Nicolau, L. Soler, and C. Chefd'hotel. Biomechanically driven registration of pre- to intra-operative 3d images for laparoscopic surgery. In: Int. Conf. MICCAI, pp. 1–9, Springer, 2013.
- 12 Peterlík, I., H. Courtecuisse, R. Rohling, P. Abolmaesumi, C. Nguan, S. Cotin, and S. Salcudean. Fast elastic registration of soft tissues under large deformations. Medical image analysis 45:24–40, 2018.
- 13 Peterlík, I., C. Duriez, and S. Cotin. Modeling and real-time simulation of a vascularized liver tissue. In: International Conference on Medical Image Computing and Computer-Assisted Intervention, pp. 50–57. 2012.
- 14 Peters, T. and K. Cleary. Image-Guided Interventions: Technology and Applications, Springer, 2008.
- 15 Pinheiro, M. A. and J. Kybic. Incremental b-spline deformation model for geometric graph matching. In: ISBI. 2018.
- 16 Pinheiro, M. A., J. Kybic, and P. Fua. Geometric graph matching using monte carlo tree search. IEEE Trans Pattern Anal Mach Intell. 39:2171–2185, 2017.
- 17 Plantefève, R., S. Kadoury, A. Tang, and I. Peterlik. Robust automatic graph-based skeletonization of hepatic vascular trees. In: Intravascular Imaging and Computer Assisted Stenting, and Large-Scale Annotation of Biomedical Data and Expert Label Synthesis, pp. 20–28. 2017.
- 18 Plantefève, R., I. Peterlik, N. Haouchine, and S. Cotin. Patient-specific biomechanical modeling for guidance during minimally-invasive hepatic surgery. Ann Biomed Eng. 44:139–153, 2016.
- 19 Przemieniecki, J. S. Theory of Matrix Structural Analysis, Dover, 1985, 468 pp.
- 20 Serradell, E., M. A. Pinheiro, R. Sznitman, J. Kybic, F. Moreno-Noguer, and P. Fua. Non-rigid graph registration using active testing search. IEEE Trans Pattern Anal Mach Intell. 37:625–638, 2015.
- 21 Smistad, E., A. C. Elster, and F. Lindseth. GPU accelerated segmentation and centerline extraction of tubular structures from medical images. Int J Comput Assist Radiol Surg. 9:561–575, 2014.
- 22 Sotiras, A., C. Davatzikos, and N. Paragios. Deformable medical image registration: A survey. IEEE Trans Med Imag 32:1153–1190, 2013.
- 23 Tacher, V., A. Radaelli, M. Lin, and J.-F. Geschwind. How I do it: Cone-beam CT during transarterial chemoembolization for liver cancer. Radiology 274:320–334, 2015.
- 24 Xue, H., C. Malamateniou, L. Srinivan, J. Hajnal, and D. Rueckert. Automatic extraction and matching of neonatal cerebral vasculature. In: ISBI 2006, pp. 810–818. 2006.
- 25 Zhou, F. and F. D. la Torre Frade. Factorized graph matching. IEEE Trans Pattern Anal Mach Intell., 2015.