1 Introduction
With the introduction of diffusion tensor magnetic resonance imaging (DT-MRI) [4], there has been an ever-increasing demand for rigorous, reliable and robust methods for the processing of tensor-valued data, such as estimation, filtering, regularization and segmentation. Many well-established PDE-based methods used for the processing of scalar-valued data have been extended in various ways to the processing of multivalued data, such as vector-valued data and smoothly constrained data [5, 11, 20, 21, 22, 23]. Recently, some efforts have been directed toward the extension of these methods to tensor fields [3, 8, 7, 13, 16, 24, 25]. The generalization of the methods used for scalar- and vector-valued data to tensor-valued data is being pursued within mainly three formalisms: the use of geometric invariants of tensors such as eigenvalues, determinant and trace; the generalization of Di Zenzo’s concept of a structure tensor for vector-valued images to tensor-valued data; and, recently, differential-geometric methods.
The aim of the present paper is to generalize the total variation (TV) flow, mean curvature motion (MCM), the modified mean curvature flow and self-snakes to tensor-valued data such as DT-MRI. The key ingredient for these generalizations is the use of the Riemannian geometry of the space of symmetric positive-definite (SPD) matrices. The remainder of this paper is organized as follows. In Section 2 we give a compilation of results on the differential geometry of the Riemannian manifold of symmetric positive-definite matrices. In Section 3 we fix notation and recall some facts about immersions between Riemannian manifolds and their mean curvature. We explain in Section 4 how to describe a DT-MR image by differential-geometric concepts. Section 5 is the core of our paper, in which we extend several mean curvature-based flows for denoising and segmentation from the scalar and vector settings to the tensor setting. In Section 6 we present some numerical results.
2 Differential Geometry of $\mathcal{P}(n)$
Positive-definite matrices are omnipresent in many engineering and physical contexts. They play important roles in various disciplines such as control theory, continuum mechanics, numerical analysis, covariance analysis and signal processing. Recently, they have gained increasing attention within the diffusion tensor magnetic resonance imaging (DT-MRI) community, as they are used as an encoding for the principal diffusion directions and strengths in biological tissues.
We here recall some differential-geometric facts about the space of symmetric positive-definite matrices that have been recently published by the authors. We denote by $\mathcal{S}(n)$ the vector space of $n \times n$ symmetric matrices. A matrix $S \in \mathcal{S}(n)$ is said to be positive semidefinite if $x^T S x \geq 0$ for all $x \in \mathbb{R}^n$, and positive definite if, in addition, $S$ is invertible. The space of all $n \times n$ symmetric, positive-definite matrices will be denoted by $\mathcal{P}(n)$. We note that the set of positive-semidefinite matrices is a pointed convex cone in the linear space of $n \times n$ matrices, and that $\mathcal{P}(n)$ is the interior of this cone. It is a differentiable manifold endowed with a Riemannian structure. The tangent space to $\mathcal{P}(n)$ at any of its points $P$ is the space $\{P\} \times \mathcal{S}(n)$, which for simplicity is identified with $\mathcal{S}(n)$. On each tangent space we introduce the base point-dependent inner product defined by $\langle A, B \rangle_P := \operatorname{tr}(P^{-1} A P^{-1} B)$.
This inner product leads to a natural Riemannian metric on the manifold $\mathcal{P}(n)$ that is given at each $P$ by the differential
$$ds^2 = \operatorname{tr}\!\big[(P^{-1}\,dP)^2\big], \qquad (1)$$
where $dP$ is the symmetric matrix with elements $dp_{ij}$. We note that the metric (1) is invariant under congruent transformations $P \mapsto C^T P C$, where $C$ is any invertible matrix, and under inversion $P \mapsto P^{-1}$.
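As a quick numerical illustration of these two invariances (our own sketch, assuming NumPy; the random matrices are arbitrary test data, not from the paper):

```python
import numpy as np

# Numerical check of the two invariances of the metric (1).
def inner(P, A, B):
    """Base point-dependent inner product <A, B>_P = tr(P^-1 A P^-1 B)."""
    Pinv = np.linalg.inv(P)
    return np.trace(Pinv @ A @ Pinv @ B)

rng = np.random.default_rng(0)
n = 3
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)                          # a point of P(n): SPD
A = rng.standard_normal((n, n)); A = (A + A.T) / 2   # tangent vectors:
B = rng.standard_normal((n, n)); B = (B + B.T) / 2   # symmetric matrices
C = rng.standard_normal((n, n))                      # almost surely invertible

# Congruence invariance: P -> C^T P C, with tangent vectors A -> C^T A C.
assert np.isclose(inner(C.T @ P @ C, C.T @ A @ C, C.T @ B @ C), inner(P, A, B))

# Inversion invariance: P -> P^-1; the differential of matrix inversion
# maps a tangent vector A to -P^-1 A P^-1.
Pinv = np.linalg.inv(P)
assert np.isclose(inner(Pinv, -Pinv @ A @ Pinv, -Pinv @ B @ Pinv),
                  inner(P, A, B))
```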
For an $n \times n$ matrix $A$ we denote by $\operatorname{vec}(A)$ the $n^2$-dimensional column vector that is obtained by stacking the columns of $A$. If $A$ is symmetric, then the elements of $\operatorname{vec}(A)$ are redundant. We will denote by $\operatorname{vech}(A)$ the $\frac{1}{2}n(n+1)$-dimensional vector that is obtained from $\operatorname{vec}(A)$ by eliminating the redundant elements, e.g., all supradiagonal elements of $A$. We note that there are several ways to arrange the independent elements of $A$ into $\operatorname{vech}(A)$. In any case, there exists a unique $n^2 \times \frac{1}{2}n(n+1)$ matrix, called the duplication matrix and denoted by $D_n$, that, by duplicating certain elements, reconstructs $\operatorname{vec}(A)$ from $\operatorname{vech}(A)$, i.e., $D_n$ is the matrix such that
$$D_n \operatorname{vech}(A) = \operatorname{vec}(A). \qquad (2)$$
The duplication matrix $D_n$, which has been studied extensively by Henderson and Searle [9], and by Magnus and Neudecker [14], has full column rank $\frac{1}{2}n(n+1)$. Hence $D_n^T D_n$ is nonsingular, and it follows that the duplication matrix has a Moore–Penrose inverse, denoted by $D_n^+$ and given by
$$D_n^+ = (D_n^T D_n)^{-1} D_n^T.$$
It follows from (2) that
$$\operatorname{vech}(A) = D_n^+ \operatorname{vec}(A). \qquad (3)$$
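The following self-contained sketch (our illustration, assuming NumPy) builds $D_n$ for the lower-triangular, column-wise ordering of $\operatorname{vech}$ — one of the possible arrangements mentioned above — and verifies (2) and (3):

```python
import numpy as np

def vec(A):
    return A.flatten(order="F")            # stack the columns

def vech(A):
    # Lower-triangular part, column by column
    n = A.shape[0]
    return np.concatenate([A[j:, j] for j in range(n)])

def duplication_matrix(n):
    """D_n with D_n vech(S) = vec(S) for every symmetric S."""
    m = n * (n + 1) // 2
    D = np.zeros((n * n, m))
    k = 0
    for j in range(n):
        for i in range(j, n):
            D[j * n + i, k] = D[i * n + j, k] = 1.0   # entry (i, j) and its mirror
            k += 1
    return D

n = 3
D = duplication_matrix(n)
S = np.array([[1., 2., 3.],
              [2., 4., 5.],
              [3., 5., 6.]])

assert np.allclose(D @ vech(S), vec(S))               # eq. (2)
Dplus = np.linalg.inv(D.T @ D) @ D.T                  # Moore-Penrose inverse
assert np.allclose(Dplus @ vec(S), vech(S))           # eq. (3)
assert np.linalg.matrix_rank(D) == n * (n + 1) // 2   # full column rank
```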
By using the vector $p = \operatorname{vech}(P)$ as a parametrization of $\mathcal{P}(n)$, we obtain that the matrix of components of the metric tensor associated with the Riemannian metric (1) is given explicitly by [25]
$$G(P) = D_n^T \left(P^{-1} \otimes P^{-1}\right) D_n. \qquad (4)$$
For differential-geometric operators on $\mathcal{P}(n)$ it is important to obtain the expression of the inverse of the metric and that of its determinant. The matrix of components of the inverse metric tensor is given by
$$G^{-1}(P) = D_n^+ \left(P \otimes P\right) (D_n^+)^T, \qquad (5)$$
and the determinant of $G(P)$ is
$$\det G(P) = 2^{n(n-1)/2} \left(\det P\right)^{-(n+1)}. \qquad (6)$$
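Formulas (4)–(6) are easy to check numerically on a random SPD matrix; the sketch below (our illustration, assuming NumPy) rebuilds the duplication matrix so it is self-contained:

```python
import numpy as np

# Numerical check of eqs. (4)-(6) for a random SPD matrix.
def duplication_matrix(n):
    m = n * (n + 1) // 2
    D = np.zeros((n * n, m))
    k = 0
    for j in range(n):
        for i in range(j, n):
            D[j * n + i, k] = D[i * n + j, k] = 1.0
            k += 1
    return D

n = 3
D = duplication_matrix(n)
Dplus = np.linalg.inv(D.T @ D) @ D.T

rng = np.random.default_rng(1)
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)                 # random SPD matrix
Pinv = np.linalg.inv(P)

G = D.T @ np.kron(Pinv, Pinv) @ D           # metric tensor, eq. (4)
Ginv = Dplus @ np.kron(P, P) @ Dplus.T      # inverse metric, eq. (5)

assert np.allclose(G @ Ginv, np.eye(n * (n + 1) // 2))
assert np.isclose(np.linalg.det(G),                   # determinant, eq. (6)
                  2 ** (n * (n - 1) // 2) * np.linalg.det(P) ** (-(n + 1)))
```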
In the coordinate system $p = \operatorname{vech}(P)$, the Christoffel symbols are given by [25]
$$\Gamma^k_{ij} = -\tfrac{1}{2}\left[\operatorname{vech}\!\left(E_i P^{-1} E_j + E_j P^{-1} E_i\right)\right]_k,$$
where $(E_i)$ is the dual basis associated with the local coordinates $(p^i)$, i.e., the symmetric matrices satisfying $\operatorname{vech}(E_i) = e_i$. As the elements of $D_n$ and $D_n^+$ are either 0, 1, or $\frac{1}{2}$, it follows from the above that each nonvanishing Christoffel symbol is given by an element of $P^{-1}$ or half of it.
Let $P$ be an element of $\mathcal{P}(3)$ and let $dP$ be a (symmetric) infinitesimal variation of it:
$$P = \begin{pmatrix} p_{11} & p_{12} & p_{13} \\ p_{12} & p_{22} & p_{23} \\ p_{13} & p_{23} & p_{33} \end{pmatrix}, \qquad dP = \begin{pmatrix} dp_{11} & dp_{12} & dp_{13} \\ dp_{12} & dp_{22} & dp_{23} \\ dp_{13} & dp_{23} & dp_{33} \end{pmatrix}.$$
Hence, the complete and reduced vector forms of $dP$ are, respectively,
$$\operatorname{vec}(dP) = (dp_{11}, dp_{12}, dp_{13}, dp_{12}, dp_{22}, dp_{23}, dp_{13}, dp_{23}, dp_{33})^T,$$
$$\operatorname{vech}(dP) = (dp_{11}, dp_{12}, dp_{13}, dp_{22}, dp_{23}, dp_{33})^T.$$
The components of the inverse metric tensor and the Christoffel symbols for $\mathcal{P}(3)$ are given explicitly in the appendix.
3 Immersions and Mean Curvature
Let $(M, g)$ and $(N, h)$ be two connected Riemannian manifolds of dimensions $m$ and $n$, respectively. We consider a map $\varphi : M \to N$ that is of class $C^\infty$, i.e., $\varphi \in C^\infty(M, N)$. Let $(x^1, \dots, x^m)$ be a local coordinate system of $M$ in a neighborhood of a point $p$, and let $(y^1, \dots, y^n)$ be a local coordinate system of $N$ in a neighborhood of $\varphi(p)$.
The mapping $\varphi$ induces a metric $\varphi^* h$ on $M$ defined by
$$(\varphi^* h)_{ij}(x) = h_{\alpha\beta}(\varphi(x))\, \frac{\partial \varphi^\alpha}{\partial x^i}\, \frac{\partial \varphi^\beta}{\partial x^j}. \qquad (7)$$
This metric is called the pullback metric induced by $\varphi$, as it transports the metric of $N$ in the direction opposite to that of the mapping $\varphi$.
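Formula (7) can be checked numerically on a simple graph immersion $\varphi(x, y) = (x, y, f(x, y))$ into Euclidean $\mathbb{R}^3$, where analytically $(\varphi^* h)_{ij} = \delta_{ij} + f_i f_j$. The sketch below (our illustration, assuming NumPy; the function $f$ is an arbitrary choice) compares a finite-difference evaluation of (7) with the closed form:

```python
import numpy as np

# Pullback of the Euclidean metric of R^3 under phi(x, y) = (x, y, f(x, y)).
def f(x, y):
    return np.sin(x) * y

def phi(x, y):
    return np.array([x, y, f(x, y)])

def jacobian(x, y, eps=1e-6):
    # One column per coordinate of the source manifold (central differences)
    return np.column_stack([
        (phi(x + eps, y) - phi(x - eps, y)) / (2 * eps),
        (phi(x, y + eps) - phi(x, y - eps)) / (2 * eps),
    ])

x0, y0 = 0.7, -1.3
J = jacobian(x0, y0)
g = J.T @ J                                # eq. (7) with h the identity metric

grad_f = np.array([np.cos(x0) * y0, np.sin(x0)])
assert np.allclose(g, np.eye(2) + np.outer(grad_f, grad_f), atol=1e-6)
```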
An isometry is a diffeomorphism $\varphi : M \to N$ that preserves the Riemannian metric, i.e., if $g$ and $h$ are the metrics of $M$ and $N$, respectively, then $g = \varphi^* h$. It follows that an isometry preserves the length of curves, i.e., if $\gamma$ is a smooth curve on $M$, then $\varphi \circ \gamma$ is a curve of the same length on $N$. Also, the image of a geodesic under an isometry is again a geodesic.
A mapping $\varphi : M \to N$ is called an immersion if its differential $d\varphi_p : T_p M \to T_{\varphi(p)} N$ is injective for every point $p$ in $M$. We say that $M$ is immersed in $N$ by $\varphi$, or that $M$ is an immersed submanifold of $N$. When an immersion $\varphi$ is injective, it is called an embedding of $M$ into $N$. We then say that $M$ is an embedded submanifold, or simply a submanifold, of $N$.
Now let $\varphi : M \to (N, h)$ be an immersion of a manifold $M$ into a Riemannian manifold $N$ with metric $h$. The first fundamental form associated with the immersion is $g = \varphi^* h$. Its components are $g_{ij} = h_{\alpha\beta}\, \partial_i \varphi^\alpha\, \partial_j \varphi^\beta$, where $\partial_i = \partial / \partial x^i$. The total covariant derivative $\nabla d\varphi$ is called the second fundamental form of $\varphi$. The second fundamental form takes values in the normal bundle of $M$. The mean curvature vector $\mathbf{H}$ of an isometric immersion is defined as the trace of the second fundamental form divided by $m$ [10]:
$$\mathbf{H} = \frac{1}{m} \operatorname{tr}_g \left(\nabla d\varphi\right). \qquad (8)$$
In local coordinates, we have [10]
$$H^\alpha = \frac{1}{m} \left( \Delta_g \varphi^\alpha + g^{ij}\, \tilde{\Gamma}^\alpha_{\beta\gamma}\, \partial_i \varphi^\beta\, \partial_j \varphi^\gamma \right), \qquad (9)$$
where $\tilde{\Gamma}^\alpha_{\beta\gamma}$ are the Christoffel symbols of $N$ and $\Delta_g$ is the Laplace–Beltrami operator on $(M, g)$ given by
$$\Delta_g f = \frac{1}{\sqrt{\det g}}\, \partial_i \left( \sqrt{\det g}\; g^{ij}\, \partial_j f \right). \qquad (10)$$
4 Diffusion-Tensor MRI Data as Isometric Immersions
A volumetric tensor-valued image can be described mathematically as an isometric immersion of a three-dimensional domain $\Omega \subset \mathbb{R}^3$ in the fiber bundle $\mathbb{R}^3 \times \mathcal{P}(3)$, which is a nine-dimensional manifold. We denote by $(\Sigma, g)$ the image manifold and its metric and by $(M, h)$ the target manifold and its metric. Here $\Sigma = \Omega$ and $M = \mathbb{R}^3 \times \mathcal{P}(3)$. Consequently, a tensor-valued image is a section of this fiber bundle. The metric of $M$ is given by
$$ds^2 = (dx^1)^2 + (dx^2)^2 + (dx^3)^2 + \operatorname{tr}\!\big[(P^{-1}\,dP)^2\big]. \qquad (11)$$
The target manifold $M$, in this context, is also called the space-feature manifold [23]. We can rewrite the metric defined by (11) as the quadratic form
$$ds^2 = d\mathbf{x}^T h\, d\mathbf{x},$$
where $d\mathbf{x} = (dx^1, dx^2, dx^3, dp^1, \dots, dp^6)^T$. The corresponding metric tensor is
$$h = \begin{pmatrix} I_3 & 0 \\ 0 & G(P) \end{pmatrix},$$
where $G(P)$ is the metric tensor of $\mathcal{P}(3)$ as defined in Section 2.
Since the image is an isometric immersion, we have $g = \varphi^* h$. Therefore
$$g_{ij} = \delta_{ij} + [G(P)]_{kl}\, \partial_i p^k\, \partial_j p^l. \qquad (12)$$
We note that $n - m$ is the codimension of $\Sigma$ in $M$. In compact form, we have
$$g = I_m + J^T G(P)\, J, \qquad (13)$$
where $J$ is the Jacobian matrix with entries $J_{ki} = \partial p^k / \partial x^i$ and $G(P)$ is given by (4). (We take $m = 2$ for a slice and $m = 3$ for a volumetric DT-MRI image.)
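The induced metric (13) is straightforward to evaluate pointwise. The sketch below (our illustration, assuming NumPy; the smooth SPD-valued field is a synthetic assumption) computes $g$ at one pixel of a 2-D slice and verifies that $g - I_2$ is positive semidefinite, as it must be:

```python
import numpy as np

# Eq. (13) at one pixel of a 2-D slice (m = 2): g = I_2 + J^T G(P) J,
# with J the Jacobian of p = vech(P) with respect to (x, y).
def vech(A):
    n = A.shape[0]
    return np.concatenate([A[j:, j] for j in range(n)])

def duplication_matrix(n):
    m = n * (n + 1) // 2
    D = np.zeros((n * n, m))
    k = 0
    for j in range(n):
        for i in range(j, n):
            D[j * n + i, k] = D[i * n + j, k] = 1.0
            k += 1
    return D

def field(x, y):
    """A smooth synthetic SPD(3)-valued image (illustrative assumption)."""
    return np.array([[2.0 + np.sin(x), 0.3 * y,          0.0],
                     [0.3 * y,         2.0 + np.cos(y),  0.1 * x],
                     [0.0,             0.1 * x,          3.0]])

x0, y0, eps = 0.4, 0.9, 1e-5
P = field(x0, y0)
D = duplication_matrix(3)
G = D.T @ np.kron(np.linalg.inv(P), np.linalg.inv(P)) @ D   # eq. (4)

J = np.column_stack([                  # finite-difference Jacobian of vech(P)
    (vech(field(x0 + eps, y0)) - vech(field(x0 - eps, y0))) / (2 * eps),
    (vech(field(x0, y0 + eps)) - vech(field(x0, y0 - eps))) / (2 * eps),
])
g = np.eye(2) + J.T @ G @ J            # eq. (13)

assert np.allclose(g, g.T)
assert np.all(np.linalg.eigvalsh(g) >= 1 - 1e-9)   # g - I_2 positive semidefinite
```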
5 Geometric Curvature-Driven Flows for Tensor-Valued Data
The basic concept on which geometric curvature-driven flows are based is the mean curvature of a submanifold embedded in a higher-dimensional manifold. Here we generalize the scalar mean curvature flow to a mean curvature flow in the space-feature manifold $\mathbb{R}^3 \times \mathcal{P}(3)$. For this, we embed the Euclidean image space into this Riemannian manifold and use some classical results from differential geometry to derive the Riemannian mean curvature (RMC). We then use the RMC to generalize mean curvature flow to tensor-valued data. Given the expression of the mean curvature vector $\mathbf{H}$, we can establish some PDE-based tensor-image filtering methods. In particular, we are interested in the so-called level-set methods, which rely on PDEs that modify the shape of level sets in an image.
5.1 Riemannian Total Variation Flow
The total variation (TV) norm method introduced in [19] and its extensions have been successfully used for reducing noise and blur without smearing sharp edges in grey-level, color and other vector-valued images [6, 12, 17, 20, 21, 22]. It is then natural to look for the extension of the TV norm to tensor-valued images.
The TV norm method is obtained as a gradient-descent flow associated with the TV norm of the tensor field. This yields the following PDE, which expresses motion by the mean curvature vector:
$$\frac{\partial \varphi}{\partial t} = \mathbf{H}, \qquad (14)$$
where $\mathbf{H}$ is given by (9).
This flow can be considered as a deformation of the tensor field toward a minimal immersion. Indeed, it derives from a variational setting that minimizes the volume of the image manifold embedded in the space-feature manifold.
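In the scalar Euclidean special case, this flow reduces to the familiar curvature flow $u_t = \operatorname{div}(\nabla u / |\nabla u|)$. The following sketch (our illustration, not the paper's scheme; NumPy, periodic boundaries via `np.roll`, an $\epsilon$-regularized gradient norm and the step sizes are all assumptions) applies a few explicit steps to a noisy square and checks that the discrete total variation decreases:

```python
import numpy as np

# One explicit Euler step of the scalar flow u_t = div(grad u / |grad u|),
# with the gradient norm regularized by eps. Periodic boundaries (np.roll)
# and the parameter values are illustrative choices.
def tv_step(u, dt=0.02, eps=0.1):
    ux = (np.roll(u, -1, axis=1) - np.roll(u, 1, axis=1)) / 2.0
    uy = (np.roll(u, -1, axis=0) - np.roll(u, 1, axis=0)) / 2.0
    norm = np.sqrt(ux ** 2 + uy ** 2 + eps ** 2)
    px, py = ux / norm, uy / norm          # (near-)unit gradient field
    return u + dt * (
        (np.roll(px, -1, axis=1) - np.roll(px, 1, axis=1)) / 2.0
        + (np.roll(py, -1, axis=0) - np.roll(py, 1, axis=0)) / 2.0
    )

def total_variation(v):
    return np.sum(np.abs(np.diff(v, axis=0))) + np.sum(np.abs(np.diff(v, axis=1)))

rng = np.random.default_rng(2)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0                    # a sharp square
noisy = clean + 0.1 * rng.standard_normal(clean.shape)

smoothed = noisy
for _ in range(50):
    smoothed = tv_step(smoothed)

assert total_variation(smoothed) < total_variation(noisy)
```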
5.2 Riemannian Mean Curvature Flow
The following flow was proposed for the processing of scalar-valued images:
$$\frac{\partial u}{\partial t} = |\nabla u|\, \operatorname{div}\!\left( \frac{\nabla u}{|\nabla u|} \right), \qquad (15)$$
where $u(\cdot, 0)$ is the grey level of the image to be processed and $u(\cdot, t)$ is its smoothed version that depends on the scale parameter $t$.
The “philosophy” of this flow is that the term $|\nabla u|\, \operatorname{div}(\nabla u / |\nabla u|)$ represents a degenerate diffusion term which diffuses in the direction orthogonal to the gradient $\nabla u$ and does not diffuse at all in the direction of $\nabla u$.
This formulation has been proposed as a “morphological scale space” [2] and as a more numerically tractable method of solving total variation [15].
The natural generalization of this flow to tensor-valued data is
$$\frac{\partial \varphi}{\partial t} = |\nabla \varphi|_g\, \mathbf{H}, \qquad (16)$$
where
$$|\nabla \varphi|_g^2 = g^{ij}\, h_{\alpha\beta}\, \partial_i \varphi^\alpha\, \partial_j \varphi^\beta.$$
We note that several authors have tried to generalize curvature-driven flows to tensor-valued data in different ways. We think that the use of differential-geometric tools and concepts yields the correct generalization.
5.3 Modified Riemannian Mean Curvature Flow
To denoise highly degraded images, Alvarez et al. [1] have proposed a modification of the mean curvature flow equation (15) that reads
$$\frac{\partial u}{\partial t} = g\big( |\nabla G_\sigma * u| \big)\, |\nabla u|\, \operatorname{div}\!\left( \frac{\nabla u}{|\nabla u|} \right), \qquad (17)$$
where $G_\sigma$ is a smoothing kernel (a Gaussian, for example), $G_\sigma * u$ is therefore a local estimate of $u$ for noise elimination, and $g$ is a nonincreasing real function which tends to zero as $s \to \infty$. We note that for the numerical experiments we have used a standard choice of such a function.
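The ingredients of (17) can be sketched as follows (our illustration, assuming NumPy; the $[1, 2, 1]/4$ binomial kernel used as a cheap Gaussian surrogate and the Perona–Malik-type $g(s) = 1/(1 + (s/K)^2)$ are illustrative assumptions — the flow only requires $g$ nonincreasing with $g(s) \to 0$ as $s \to \infty$):

```python
import numpy as np

# Presmoothing and the edge-stopping factor g(|grad(G_sigma * u)|) of eq. (17).
def smooth(u, passes=4):
    """Separable [1, 2, 1]/4 smoothing, a cheap Gaussian surrogate."""
    for _ in range(passes):
        u = (np.roll(u, 1, axis=0) + 2 * u + np.roll(u, -1, axis=0)) / 4.0
        u = (np.roll(u, 1, axis=1) + 2 * u + np.roll(u, -1, axis=1)) / 4.0
    return u

def g(s, K=0.5):
    """Perona-Malik-type edge-stopping function (illustrative assumption)."""
    return 1.0 / (1.0 + (s / K) ** 2)

rng = np.random.default_rng(3)
u = np.zeros((32, 32))
u[:, 16:] = 1.0                            # a vertical edge
u += 0.05 * rng.standard_normal(u.shape)

us = smooth(u)                             # local (presmoothed) estimate of u
gx = (np.roll(us, -1, axis=1) - np.roll(us, 1, axis=1)) / 2.0
gy = (np.roll(us, -1, axis=0) - np.roll(us, 1, axis=0)) / 2.0
speed = g(np.hypot(gx, gy))                # factor that modulates the flow

assert speed[5, 16] < speed[5, 8]          # diffusion is slowed at the edge
```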
The generalization of the modified mean curvature flow to tensor-field processing is
$$\frac{\partial \varphi}{\partial t} = g\big( |\nabla (G_\sigma * \varphi)| \big)\, |\nabla \varphi|_g\, \mathbf{H}. \qquad (18)$$
The role of $g$ is to reduce the magnitude of smoothing near edges. In the scalar case, this equation does not have the same edge-enhancing action as the Perona–Malik equation. Indeed, the Perona–Malik equation has a variable diffusivity function and has been shown to selectively produce a “negative diffusion” which can increase the contrast of edges. Equations of the form (17) always have positive, or forward, diffusion; the factor $g$ merely reduces the magnitude of that smoothing. To correct this situation, Sapiro has proposed the self-snakes formalism [20], which we present in the next subsection and generalize to the matrix-valued data setting.
5.4 Riemannian Self-Snakes
The method of Sapiro, which he names self-snakes, introduces an edge-stopping function $g$ into the mean curvature flow:
$$\frac{\partial u}{\partial t} = |\nabla u|\, \operatorname{div}\!\left( g(|\nabla u|)\, \frac{\nabla u}{|\nabla u|} \right). \qquad (19)$$
Comparing equation (19) to (17), we observe that the term $\nabla g \cdot \nabla u$ is missing in the older model. This is due to the fact that the Sapiro model takes into account the image structure. Indeed, equation (19) can be rewritten as
$$\frac{\partial u}{\partial t} = g(|\nabla u|)\, |\nabla u|\, \operatorname{div}\!\left( \frac{\nabla u}{|\nabla u|} \right) + \nabla g \cdot \nabla u, \qquad (20)$$
where $\nabla g$ stands for $\nabla\big(g(|\nabla u|)\big)$. The first term is as in the anisotropic flow proposed in [1]. The second term in (20), i.e., $\nabla g \cdot \nabla u$, increases the attraction of the deforming contour toward the boundary of “objects,” acting as the shock filter introduced in [18] for deblurring. Therefore, the flow is a shock filter acting like the backward diffusion in the Perona–Malik equation, which is responsible for the edge-enhancing properties of self-snakes. See [20] for a detailed discussion of this topic.
We are now interested in generalizing the self-snakes method to the case of tensor-valued data. We start the generalization from equation (20) in the following manner:
$$\frac{\partial \varphi}{\partial t} = g\, |\nabla \varphi|_g\, \mathbf{H} + \nabla g \cdot \nabla \varphi, \qquad (21)$$
where
$$g = g\big( |\nabla \varphi|_g \big). \qquad (22)$$
This decomposition is not artificial, since the covariant derivative on $M$ follows the same chain rule as the Euclidean directional derivative: let
$$X = X^i\, \frac{\partial}{\partial x^i}$$
be a vector field on $M$ whose components are $X^i$, and let $f$ be a scalar function. From classical differential geometry we have
$$\nabla_i \left( f X^i \right) = f\, \nabla_i X^i + (\partial_i f)\, X^i, \qquad (23)$$
and in compact form
$$\operatorname{div}(f X) = f \operatorname{div} X + \langle \nabla f, X \rangle. \qquad (24)$$
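This product rule is easy to verify numerically in the flat Euclidean case (our illustration, assuming NumPy; the functions $f$ and $X$ below are arbitrary smooth choices):

```python
import numpy as np

# Numerical check of the product rule div(f X) = f div X + <grad f, X>,
# in the plane, at a point, by central differences.
def f(x, y):
    return np.sin(x) + x * y

def X(x, y):
    return np.array([np.cos(y), x ** 2])

def div(F, x, y, eps=1e-5):
    return ((F(x + eps, y)[0] - F(x - eps, y)[0])
            + (F(x, y + eps)[1] - F(x, y - eps)[1])) / (2 * eps)

def grad(h, x, y, eps=1e-5):
    return np.array([h(x + eps, y) - h(x - eps, y),
                     h(x, y + eps) - h(x, y - eps)]) / (2 * eps)

x0, y0 = 0.3, 0.8
fX = lambda x, y: f(x, y) * X(x, y)
lhs = div(fX, x0, y0)
rhs = f(x0, y0) * div(X, x0, y0) + grad(f, x0, y0) @ X(x0, y0)
assert np.isclose(lhs, rhs, atol=1e-6)
```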
6 Numerical Experiments
In Fig. 1 (left), we give a slice of a 3D tensor field defined over a square in $\mathbb{R}^2$. We note that a symmetric positive-definite matrix $P$ is represented graphically by an ellipsoid whose principal directions are parallel to the eigenvectors of $P$ and whose axes are proportional to the eigenvalues of $P$. Figure 1 (right) shows this tensor field after the addition of noise. The resulting tensor field is used as an initial condition for the partial differential equation (21), which we solve by a finite-difference scheme with Neumann boundary conditions. We used 50 time steps of size 0.01. Figure 2 shows the tensor field smoothed by (21).

In this paper we generalized several curvature-driven flows from scalar- and vector-valued data to tensor-valued data. The use of differential-geometric tools and concepts yields a natural extension of these well-known scalar-valued data processing methods to tensor-valued data processing.
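The ellipsoid-glyph construction described above can be sketched as follows (our illustration, assuming NumPy; the tensor below is an arbitrary SPD example, not data from the paper):

```python
import numpy as np

# One tensor as an ellipsoid glyph: eigenvectors give the principal
# directions, eigenvalues the (proportional) axis lengths.
T = np.array([[3.0, 0.5, 0.0],
              [0.5, 2.0, 0.3],
              [0.0, 0.3, 1.0]])

evals, evecs = np.linalg.eigh(T)       # ascending eigenvalues, orthonormal columns
assert np.all(evals > 0)               # positive definite, as required
assert np.allclose(evecs @ np.diag(evals) @ evecs.T, T)

# Map points of the unit sphere onto the glyph surface: v -> T v stretches
# by evals[i] along the corresponding eigenvector direction.
theta, phi = np.meshgrid(np.linspace(0, np.pi, 16),
                         np.linspace(0, 2 * np.pi, 32))
sphere = np.stack([np.sin(theta) * np.cos(phi),
                   np.sin(theta) * np.sin(phi),
                   np.cos(theta)], axis=-1)
glyph = sphere @ T.T                   # ellipsoid surface points, shape (32, 16, 3)
```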
References
 [1] L. Alvarez, P.-L. Lions and J.-M. Morel, Image selective smoothing and edge detection by nonlinear diffusion (II), SIAM J. Numer. Anal., 29 (1992), pp. 845–866.
 [2] L. Alvarez and J.-M. Morel, A Morphological Approach to Multiscale Analysis: From Principles to Equations, Kluwer Academic Publishers, 1994.
 [3] V. Arsigny, P. Fillard, X. Pennec, and N. Ayache, Fast and simple calculus on tensors in the Log-Euclidean framework, in Proc. 8th Int. Conf. on Medical Image Computing and Computer-Assisted Intervention – MICCAI 2005, Part I, J. Duncan and G. Gerig, eds., vol. 3749 of LNCS, Palm Springs, CA, 2005, Springer Verlag, pp. 115–122.
 [4] P. J. Basser, J. Mattiello, and D. LeBihan, MR diffusion tensor spectroscopy and imaging, Biophysical J., 66 (1994), pp. 259–267.
 [5] P. V. Blomgren and T. F. Chan, Color TV: Total variation methods for restoration of vector-valued images, IEEE Trans. Image Processing, 7 (1998), pp. 304–309.
 [6] A. Cumani, Edge detection in multiscale images, CVGIP: Graphical Models and Image Processing, 53 (1991), pp. 40–51.

 [7] R. Deriche, D. Tschumperlé, C. Lenglet, and M. Rousson, Variational approaches to the estimation, regularization and segmentation of diffusion tensor images, in Mathematical Models in Computer Vision: The Handbook, N. Paragios, Y. Chen and O. Faugeras, eds., Springer, 2005.
 [8] C. Feddern, J. Weickert, B. Burgeth, and M. Welk, Curvature-driven PDE methods for matrix-valued images, Int. J. Comput. Vision, 69 (2006), pp. 93–107.
 [9] H. V. Henderson and S. R. Searle, Vec and vech operators for matrices, with some uses in Jacobians and multivariate statistics, Canad. J. Statist., 7 (1979), pp. 65–81.
 [10] J. Jost, Riemannian Geometry and Geometric Analysis, Springer, Berlin, 2nd ed., 1998.
 [11] R. Kimmel, R. Malladi, and N. Sochen, Images as embedded maps and minimal surfaces: movies, color, texture, and volumetric medical images, Int. J. Comput. Vision, 39 (2000), pp. 111–129.
 [12] H.-C. Lee and D. R. Cok, Detecting boundaries in a vector field, IEEE Trans. Signal Proc., 39 (1991), pp. 1181–1194.

 [13] C. Lenglet, M. Rousson, R. Deriche, and O. Faugeras, Statistics on the manifold of multivariate normal distributions: Theory and application to diffusion tensor MRI processing, J. Mathematical Imaging and Vision, (2006), in press.
 [14] J. R. Magnus and H. Neudecker, The elimination matrix: some lemmas and applications, SIAM J. Alg. Disc. Meth., 1 (1980), pp. 422–449.
 [15] A. Marquina and S. Osher, Explicit algorithms for a new time dependent model based on level set motion for nonlinear deblurring and noise removal, SIAM J. Sci. Comput., 22 (2000), pp. 378–405.
 [16] M. Moakher and P. G. Batchelor, The symmetric space of positive definite tensors: From geometry to applications and visualization, in Visualization and Processing of Tensor Fields, J. Weickert and H. Hagen, eds., Berlin, 2005, Springer, pp. 285–298.
 [17] R. Nevatia, A color edge detector and its use in scene segmentation, IEEE Trans. Syst. Man Cybern., 7 (1977), pp. 820–826.
 [18] S. Osher and L. Rudin, Feature-oriented image enhancement using shock filters, SIAM J. Numer. Anal., 27 (1990), pp. 919–940.
 [19] L. Rudin, S. Osher, and E. Fatemi, Nonlinear total variation based noise removal algorithms, Physica D, 60 (1992), pp. 259–268.
 [20] G. Sapiro, Color snakes, Technical Report HPL-95-113, Hewlett-Packard Computer Peripherals Laboratory, September 1995.

 [21] G. Sapiro, Vector-valued active contours, in Proceedings of Computer Vision and Pattern Recognition (CVPR’96), 1996, pp. 520–525.
 [22] G. Sapiro and D. Ringach, Anisotropic diffusion of multivalued images with applications to color filtering, IEEE Trans. Image Processing, 5 (1996), pp. 1582–1586.
 [23] N. A. Sochen, R. Kimmel, and R. Malladi, A general framework for low level vision, IEEE Trans. Image Process., 7 (1998), pp. 310–318.
 [24] J. Weickert and H. Hagen, eds., Visualization and Processing of Tensor Fields, Springer, Berlin, 2005.
 [25] M. Zéraï and M. Moakher, The Riemannian geometry of the space of positive-definite matrices and its application to the regularization of diffusion tensor MRI data, Submitted to: J. Mathematical Imaging and Vision, (2006).
Appendix
We give here the explicit form of the inverse metric tensor and Christoffel symbols for the Riemannian metric on $\mathcal{P}(3)$. The components of the inverse metric tensor are given by (5) with $n = 3$, i.e., $G^{-1}(P) = D_3^+ (P \otimes P) (D_3^+)^T$.
The determinant of $G(P)$ is $\det G(P) = 2^3 (\det P)^{-4}$. Let $S = \operatorname{adj}(P)$, where $\operatorname{adj}(P)$ is the adjugate (classical adjoint) matrix of $P$, so that $P^{-1} = S / \det P$.
The Christoffel symbols are arranged in six symmetric matrices $\Gamma^k = (\Gamma^k_{ij})$, $k = 1, \dots, 6$; as noted in Section 2, their nonvanishing entries are elements of $P^{-1}$ (and hence of $S / \det P$) or halves thereof.