1 Introduction
When studying the formal reduction of a system of linear differential equations
(1) 
where
is a vector with
components and a square formal meromorphic power series matrix of dimension of the form with pole order , the structure of the leading matrix allows one to reduce the problem to several problems of smaller size whenever
has several eigenvalues. The well-known
Splitting Lemma [9] states that if is block-diagonal with the additional condition that and have no common eigenvalue, there exists a formal transformation matrix
(2) 
such that the change of variable transforms the system (1) into a new system
(3) 
where
has the same pole order and is block-diagonal with the same block partition as in . The matrix is computed by
(4) 
Using the Splitting Lemma it is hence sufficient to study the case
where the leading matrix in (1) has only one
eigenvalue. Using an exponential shift of the form
where is the unique eigenvalue
of one can (and we will throughout this paper) assume that
is nilpotent.
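As a small numerical sketch of this shift (the matrix and its eigenvalue below are our own illustration, not taken from the text):

```python
import numpy as np

# Illustrative leading matrix with the single eigenvalue 2
# (upper-triangular, so its spectrum can be read off the diagonal).
A0 = np.array([[2.0, 1.0, 0.0],
               [0.0, 2.0, 3.0],
               [0.0, 0.0, 2.0]])
lam = 2.0  # the unique eigenvalue of A0

# The exponential shift replaces the leading matrix by A0 - lam*I ...
N = A0 - lam * np.eye(3)

# ... which is nilpotent: N^n = 0 for an n x n matrix.
assert np.allclose(np.linalg.matrix_power(N, 3), 0)
```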
Several methods for finding transformation matrices which again lead to non-nilpotent leading matrices have been suggested [1, 2, 3, 7, 8, 9]. It can be shown that this, combined with the Splitting Lemma, gives rise to a recursive procedure which decomposes the initial system into new systems for which one has either or . The structure of the matrix of a formal fundamental matrix solution of the system
(5) 
can be determined uniquely through this method. Here is an
invertible formal meromorphic matrix power series in a fractional
power of , is a constant complex matrix commuting
with and is a diagonal matrix containing polynomials in
the same fractional power of without constant
terms.
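In commonly used notation, such a formal fundamental matrix solution takes the classical Hukuhara–Turrittin form (the symbol names here are illustrative, chosen to match the description above):

```latex
\hat{Y}(x) \;=\; \hat{\Phi}(x)\, x^{C}\, e^{Q(x)}, \qquad
Q(x) \;=\; \operatorname{diag}\!\bigl(q_1(x^{-1/p}), \dots, q_n(x^{-1/p})\bigr),
```

where $\hat{\Phi}$ is an invertible formal meromorphic matrix power series in $x^{1/p}$, $C$ is a constant complex matrix with $CQ = QC$, and each $q_i$ is a polynomial without constant term.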
For the purposes of this paper, it is useful to distinguish between the following two types of transformation matrices:

Matrices containing formal meromorphic power series in the variable , whose determinant is not the zero series. We will refer to transformations of this type as root-free transformations. Two systems linked as in (4) by such a root-free transformation shall be called meromorphically equivalent or, for short, equivalent.

Matrices having coefficients which are formal meromorphic power series in a fractional power of , whose determinant is not the zero series. We will call these transformations ramified transformations. If is a ramified transformation, the smallest integer such that is root-free is called the ramification index of . We shall also say that is a meromorphic transformation and takes a system into a meromorphically equivalent system.
In this paper, we are interested in the situation where one cannot
find transformations of the first type in order to obtain a system
with a non-nilpotent leading matrix. In other words, the
introduction of a ramification is necessary. This can be stated in
terms of formal solutions by saying that the dominant (negative)
power of in the matrix , or alternatively the largest
slope of the
Newton polygon of the system, is a rational number [2, 5].
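The slopes in question are those of the lower convex boundary of a point set built from the valuations of the system's coefficients. A minimal sketch of the slope computation (the point set and encoding are our own illustration; a slope whose denominator is p > 1 indicates that a ramification of index p is unavoidable):

```python
from fractions import Fraction

def newton_slopes(points):
    """Slopes of the lower convex boundary of the Newton polygon
    spanned by `points`, given as (abscissa, ordinate) integer pairs
    with distinct abscissas."""
    pts = sorted(set(points))
    hull = [pts[0]]
    for p in pts[1:]:
        # drop points that lie on or above the lower convex boundary
        while len(hull) >= 2:
            (x1, y1), (x2, y2) = hull[-2], hull[-1]
            if (x2 - x1) * (p[1] - y1) - (y2 - y1) * (p[0] - x1) <= 0:
                hull.pop()
            else:
                break
        hull.append(p)
    return [Fraction(y2 - y1, x2 - x1)
            for (x1, y1), (x2, y2) in zip(hull, hull[1:])]

print(newton_slopes([(0, 0), (1, 1), (2, 1)]))  # [Fraction(1, 2)]
```

Here the single slope 1/2 would signal that a ramification of index 2 must be introduced.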
In this case, the methods in [1, 3, 8, 9] apply a sequence of root-free, ramified and shearing transformations (in [2, 7] a different strategy is employed). A shearing transformation is a transformation of the form
where and . In [1] it is shown that it is always possible to achieve this by using exponential shifts and a transformation of the form
(6) 
where is a root-free transformation having a finite number of nonzero terms and is a ramified shearing transformation. The transformed system
(7) 
has a coefficient matrix of the form
(8) 
where is relatively prime to and has several
eigenvalues. Applying the Splitting Lemma to (7) then
yields a meromorphic transformation taking the system into a
new system whose coefficient matrix is block-diagonal. Hence the
remaining computations are carried out on matrices
containing ramified power series.
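The effect of a shearing transformation can be made concrete with a small symbolic sketch (the 2×2 matrix and the index 2 are our own illustration; for a differential system the derivative term of the transformation also contributes, but, the transformation being diagonal, only to the diagonal entries):

```python
import sympy as sp

x = sp.symbols('x', positive=True)

# An illustrative shearing transformation of ramification index 2:
S = sp.diag(1, sp.sqrt(x))

# An illustrative 2x2 coefficient matrix with nilpotent leading matrix.
A = sp.Matrix([[0, 1],
               [x, 0]])

# Conjugation by S multiplies the (i, j) entry by x**((j - i)/2),
# rebalancing the valuations of the entries.
B = sp.simplify(S.inv() * A * S)
print(B)  # Matrix([[0, sqrt(x)], [sqrt(x), 0]])
```

After factoring out x**(1/2), the remaining leading matrix [[0, 1], [1, 0]] is invertible with eigenvalues ±1, so the ramification has removed the nilpotency.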
One may ask under which conditions there also exists a root-free
transformation which achieves a block-diagonalisation of the
original system (1) without introducing
ramifications, and how to compute such a transformation.
We shall give a sufficient condition for the existence of such a root-free transformation and also provide a constructive method for computing its coefficients. Denote by the set of eigenvalues of a complex square matrix . We will prove the following theorem:
Theorem 1.1 (“Root-Free Splitting Lemma”).
Consider the system (1) and assume there exists a shearing transformation of ramification index taking the system into one of the form (7) such that its leading matrix is similar to a block-diagonal matrix
and suppose that for all and for all . Then there exists a root-free transformation with the following properties:

transforms the system (1) into an equivalent system with block-diagonal coefficient matrix
where the block sizes match those in the matrix .

The matrix has the leading matrix and the same pole order as .

A finite number of coefficients of the root-free transformation can be computed from a finite number of the coefficients of the system (1).
The classical Splitting Lemma can be seen as a particular case of this theorem by putting
as the identity matrix and
. This paper is organised as follows: in Section
2, we review the classical Splitting Lemma.
In the following section we introduce a special class of systems
and give a variant of the Splitting Lemma, particular to this
class. Using this, we will give the proof of Theorem
1.1 in Section 4 and illustrate the
benefits of our theorem for the formal
reduction on an example in Section 5.
Notations: Throughout the paper, empty entries in matrices are to be filled with 0. We write for a (block-)diagonal matrix whose diagonal entries are the . The valuation of a polynomial or formal power series (with possibly negative or fractional exponents) is the smallest occurring power in the variable . Further notation is introduced as it appears in the text.
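For instance, the valuation can be computed directly from a sparse representation of a series (the dictionary encoding below is our own illustration):

```python
from fractions import Fraction

def valuation(coeffs):
    """Valuation of a formal series given as {exponent: coefficient};
    exponents may be negative or fractional (Fraction), as in the text."""
    support = [e for e, c in coeffs.items() if c != 0]
    return min(support) if support else None  # None for the zero series

# x**(-2) + 3 + 5*x has valuation -2:
assert valuation({-2: 1, 0: 3, 1: 5}) == -2
# a ramified series x**(1/2) + x has valuation 1/2:
assert valuation({Fraction(1, 2): 1, 1: 1}) == Fraction(1, 2)
```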
2 Review of the Splitting Lemma
As we have previously mentioned, the Splitting Lemma is a well-known result. Its proof is carried out in a constructive fashion and gives a method for computing the coefficients of the transformation matrix as in (2), see for example [1, 2, 9]. We repeat it here for the sake of completeness. Also, we will formulate it for meromorphic systems in preparation for the proof of Lemma 3.4.
Lemma 2.1.
Consider the system (7) and assume that is block-diagonal
such that
Then there exists a formal meromorphic transformation of the form
such that the transformed system is block-diagonal with the same block partition as in .
Proof We use a transformation of the special form
with . Denote by the matrix . Inserting the series expansion for and and comparing coefficients gives the recursion formula
(9) 
where for . Equation (9) is of the form
(10) 
where
depends only on with and with . Using the special form of
and decomposing into block structure accordingly gives the following system of equations:
(11)  
(12) 
where and are unknown, and
(13)  
with unknowns and . Given , the first two equations (11) and (12) can be solved by setting and . The remaining equations can be solved uniquely for and because the matrices and have no eigenvalues in common, see e.g. [4].
3 On Commutative Systems
In this section, we study a particular class of meromorphic
systems. The starting point of our considerations was [1, Lemma 5,
Section 3.3], which observes that a system transformed by a
shearing transformation has a special structure. We will state
this more generally and give conditions under which this special
structure is preserved by the Splitting Lemma.
Definition 3.1.
Remark 3.1.
The considerations in [1] correspond, in our notation,
to the case of a commutative system where is
an invertible diagonal matrix. However, this restriction is not
necessary in this section and we will develop our theory
first for arbitrary matrices .
Remark 3.2.
Two complex matrices and satisfying
are called commutative in [6]. In terms
of their notation, the th coefficient of a commutative matrix and the
matrix are commutative.
The following lemma gives an alternative characterisation for commutative systems which will be useful later. Note that similar concepts are used in [1].
Lemma 3.1.
Consider a system of the form (7). Then the following two statements are equivalent:

is commutative.

.
Proof A direct calculation shows:
It is also straightforward to see that we have
Lemma 3.2.
Consider a system of the form (7) and suppose is commutative where denotes the identity matrix. Then is an unramified formal meromorphic power series matrix.
We make the following definition: for two eigenvalues and of we define an equivalence relation by
and denote by the set
where we will, slightly abusing notation, identify with
.
Given , it is clear that the matrix is commutative.
Lemma 3.3.
Let as in (8) be commutative and let and be two eigenvalues of with . Then there exists such that is commutative and
with , , and .
Proof The existence of a matrix so that satisfies the conditions of the lemma can be seen easily from elementary properties of matrix decomposition. We will use techniques similar to those in [6] in order to show that has the required block-diagonal structure. Let
where the block partition matches that in the matrix . Inserting into the equation yields in particular the two conditions
and
The first of these two equations is of the form
The assumption implies that the matrices and have no eigenvalue in common. The above equation therefore has the unique solution . A very similar argument applies to the second equation. This proves the lemma.
Remark 3.3.
The matrices , and are in general
not uniquely determined.
We now show that for a commutative system which has block-diagonal structure as in Lemma 3.3, application of the Splitting Lemma preserves the property of being commutative.
Lemma 3.4 (“Splitting Lemma for Commutative Systems”).
Consider the system (7) and assume that is commutative with and block-diagonal with blocks of the same dimension
such that
Then there exists a commutative meromorphic transformation of the form
(14) 
such that the transformed system is commutative and block-diagonal with the same block partition as in and .
Proof The existence of the transformation is given by Lemma 2.1, the classical Splitting Lemma. What remains to show is that and the transformed system are commutative. Denote by the coefficient matrix of the transformed system. Using the notations as in (9) and (10), we will show that the following relations hold:
(15)  
(16)  
(17) 
for . The case holds trivially by putting
since and . Let be an arbitrary positive integer. We will see
that if the above relations hold for
then they hold for . The claim then follows by induction.
We compute
where we have used (16) and (17) for and the assumption that is commutative.
This proves (15) for .
We decompose into blocks according to the block structure of and and find, using (11),
We can show an analogous relationship for
and using (12). Hence we can see that (17)
holds for .
It remains to show (16), which is equivalent to showing
(18)  
(19) 
We will only show that the first of these two equations holds, the second can be dealt with similarly. Multiplying (13) with on the left and with on the right and combining the two equations yields
This equation is of the form
The assumption implies that the matrices and have no eigenvalue in common. The above equation therefore has the unique solution , from which we conclude (18). This completes the proof of the lemma.
Remark 3.4. We observe that the two block matrices in the transformed system are commutative and commutative, respectively.
4 A Root-Free Splitting Lemma
We define a generalised shearing transformation as a transformation of the form where is a shearing transformation and .
Proposition 4.1.
Consider a system as in (7) with leading matrix and let . The following statements are equivalent:

There exists a system as in (1) and a generalised shearing transformation of ramification index such that .

The system (7) is commutative, the matrix is similar to a diagonal matrix and . Furthermore, if is an eigenvalue of with multiplicity , the numbers are all eigenvalues of the same multiplicity .
Proof We prove : let be the generalised shearing transformation. Since where with , we find with
showing that is commutative where
satisfies the stated properties. The claimed symmetry
in the spectrum of can be shown as in the proofs of
[1, Lemma 5, Section 3.3] and [6, Theorem 5]
since we have
and is a primitive th root of unity.
In order to prove the converse direction, we first assume that with and define the shearing transformation
We observe that . Transform the given system using this transformation and denote the coefficient matrix of the transformed system by . We compute
showing that is commutative and hence (Lemma 3.2) must be an unramified formal meromorphic power series matrix. The case of the general matrix follows by first applying a constant similarity transformation which diagonalises .
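The spectral symmetry asserted in the proposition can be checked numerically on a small example (the matrix M, which would arise as the leading matrix after an illustrative shearing of index 3 applied to a companion system, is our own choice):

```python
import numpy as np

# Illustrative leading matrix after a shearing of index p = 3:
# a cyclic permutation matrix, whose eigenvalues are the cube roots of unity.
M = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [1.0, 0.0, 0.0]])

zeta = np.exp(2j * np.pi / 3)  # primitive 3rd root of unity

# The spectrum of zeta*M is zeta times the spectrum of M; equality of the
# characteristic polynomials shows the spectrum (with multiplicities)
# is invariant under multiplication by zeta.
assert np.allclose(np.poly(M), np.poly(zeta * M))
```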
We can now give the proof of our main theorem.
Proof of Theorem 1.1 Let such that has a leading matrix as in the assumptions of the theorem. In a similar way to the proof of Lemma 3.3, we can see that is block-diagonal, matching the block structure of . Using this and Proposition 4.1, we obtain that is commutative where is similarly block-diagonal. Note that, and being relatively prime, the two conditions and are equivalent. We can therefore apply Lemma 3.4 to in order to obtain a commutative transformation matrix such that is commutative and block-diagonal with matching block structure.
We claim that the transformation matrix
is root-free and satisfies the desired properties of the theorem. It is clear that is block-diagonal. Moreover, one verifies that is commutative and hence is root-free. The remaining properties follow immediately.
5 Application to the Formal Reduction
Consider the situation where the system (1) is meromorphically equivalent to a system as in (7) whose leading matrix has several eigenvalues but is not invertible. The exponential matrix polynomial in a formal fundamental matrix solution (5) is then
where the leading terms of diagonal entries of the form
are given by the nonzero eigenvalues of . The
diagonal entries having valuation greater than
correspond to the eigenvalue zero. In particular, these entries might
involve ramifications different from , or no ramification at all.
Algorithms using the classical Splitting Lemma will not be able to
compute these entries without first introducing the ramification .
In order to see how we can remedy this situation, we use the fact that is similar to a matrix of the form
where is invertible and is nilpotent. Hence the conditions of Theorem 1.1 are satisfied and we will obtain a root-free transformation which splits the system into
(20) 
This makes it possible to work independently on the two matrices
and : for the first matrix we can use a
shearing transformation introducing the (necessary) ramification
. For the second matrix, however, we now recursively apply the
formal reduction process.
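The invertible-plus-nilpotent splitting used above is the Fitting decomposition of the leading matrix; a numerical sketch (the matrix A below is our own illustration):

```python
import numpy as np

# Illustrative singular leading matrix with eigenvalues {2, 0, 0}.
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0]])
n = A.shape[0]

# Fitting decomposition: C^n = range(A^n) (+) ker(A^n); A is invertible
# on the first summand and nilpotent on the second.
An = np.linalg.matrix_power(A, n)
U, s, Vh = np.linalg.svd(An)
r = int((s > 1e-10).sum())              # rank of A^n
P = np.hstack([U[:, :r], Vh[r:, :].T])  # basis adapted to the splitting
B = np.linalg.solve(P, A @ P)           # B = P^{-1} A P

# B is block-diagonal: an invertible r x r block and a nilpotent block.
assert np.allclose(B[:r, r:], 0) and np.allclose(B[r:, :r], 0)
assert abs(np.linalg.det(B[:r, :r])) > 1e-8
assert np.allclose(np.linalg.matrix_power(B[r:, r:], n - r), 0)
```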
In order to illustrate this approach, consider the following example with , and
The shearing transformation with
transforms the system into a system of the form (7) with ramification index , and block-diagonal leading matrix with the two blocks
and the condition of Theorem 1.1 is satisfied since the first block matrix is invertible and the second is nilpotent. We obtain a root-free transformation which is of the form
with (we have computed only the first few terms)
and
Applying this transformation to the original system yields the following blockdiagonal matrix:
Applying the shearing transformation to the first block
matrix will result in a ramified system of smaller size and
invertible leading matrix equal to the first block of
. The formal reduction can now be applied to the second
block. In this example, it is found that another
shearing transformation of ramification index results in a
system with invertible leading matrix. This decomposition can be
interpreted as a separation of the different slopes of the
Newton polygon, see Theorem
5.1 below.
We conclude that if the algorithm employed for computing the
transformation (4) keeps the introduced
ramification minimal (as for example the algorithm in
[1]), using the root-free Splitting Lemma allows the
recursive computation of using minimal ramifications. This
approach leads to a root-free transformation taking a system of
the form (1) into a system from which all the leading
terms of the matrix (or, alternatively
the Newton polygon) can be determined directly. We state this as
Theorem 5.1.
The given system (1) is meromorphically equivalent to a system where the matrix is block-diagonal
with and there exists a diagonal transformation with blocks of the same sizes
with shearing transformations of ramification index () such that each of the matrices has either no pole at or an invertible leading matrix. In the latter case, let be the pole order and the leading matrix of . Then and if is an eigenvalue of with multiplicity , the eigenvalues are all of the same multiplicity . There are diagonal entries in the matrix of the form
where the dots denote terms with higher powers of . The Newton polygon of the system corresponding to this block admits a single slope of length