1 Mathematical Framework
In order to study the influence of a time-dependent fitness landscape on the dynamics of a genetic algorithm (GA), we consider GAs to be discrete dynamical systems. A detailed introduction to the resulting dynamical systems model is given by Rowe [10] (in this book). Here, we will only briefly introduce the basic concepts and the notation we use within the present work.
The GA is represented as a generation operator $\mathbf{G}$ acting on the space of all populations of size $N$ for some given encoding of the population members. If we choose the members to be encoded as bit strings of length $\ell$, this state space is given by
$$\Lambda = \Big\{\, \vec{x} \in [0,1]^{n} \;\Big|\; x_i = \frac{N_i}{N}\,, \;\; \sum_{i=0}^{n-1} x_i = 1 \,\Big\}\,, \qquad n = 2^{\ell}\,,$$
where $N_i$ denotes the number of bit strings in the population which are equal to the binary representation of $i$.
The generation operator maps the present population onto the next generation,
$$\vec{x}(t+1) = \mathbf{G}_t\big[\vec{x}(t)\big]\,.$$
This is achieved by applying a sampling procedure that draws the members of the next generation's population according to their expected concentrations, which are defined by the mixing [10, 11] and the selection scheme. For an infinite population size, the sampling acts like the identity, resulting in $\vec{x}(t+1) = \mathbf{G}_t\,\vec{x}(t)$ exactly. Hence, $\mathbf{G}_t$ represents in fact the mixing and selection scheme. For finite population size, $\mathbf{G}_t\,\vec{x}(t)$ is approximated by using the sampling process to obtain $\vec{x}(t+1)$. The deviations thereby possible become larger with decreasing population size $N$ and distort the finite-population dynamics as compared to the infinite-population case. This results in fluctuations and epoch formation, as shown in [10, 11, 12]. In the following, we will consider the infinite population limit, because it reflects the exact flow of probabilities for a particular fitness landscape. In a second step, the fluctuations and epoch formation introduced by the finiteness of a real population can be studied on the basis of that underlying probability flow.
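The finite-population sampling step can be sketched numerically. The following is a minimal illustration (not from the chapter; the concentration values are invented): the expected next-generation concentrations are realized by drawing $N$ members, here multinomially, and the deviations from the infinite-population flow shrink like $1/\sqrt{N}$.

```python
import numpy as np

# Hypothetical expected concentrations G(x) over n = 8 genotypes.
rng = np.random.default_rng(0)
n = 8
p = np.arange(1.0, n + 1.0)
p /= p.sum()

# Sample one finite population of size N and measure the deviation
# of the realized concentrations from the expected ones.
devs = {}
for N in (100, 100000):
    counts = rng.multinomial(N, p)   # sampled finite population
    devs[N] = np.abs(counts / N - p).sum()
```

With the larger population the realized concentrations lie much closer to the expected flow, which is the sense in which the infinite-population limit gives the exact probability flow.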
The generation operator is assumed to decompose into a separate mutation operator $\mathbf{M}$ and a separate selection operator $\mathbf{S}_t$, like
$$\mathbf{G}_t = \mathbf{S}_t \circ \mathbf{M}\,, \qquad (1)$$
where the selection operator contains the time dependency of the fitness landscape. Crossover is not considered in this work.
Inspired by molecular evolution, and also by common usage, we assume that mutation acts like flipping each bit independently with probability $\mu$. If we set the duration of one generation to $\Delta t = 1$, $\mu$ equals the mutation rate per bit and generation. The mutation operator then takes on the form
$$\mathbf{M} = \begin{pmatrix} 1-\mu & \mu \\ \mu & 1-\mu \end{pmatrix}^{\otimes \ell}\,, \qquad M_{ij} = \mu^{d(i,j)}\,(1-\mu)^{\ell - d(i,j)}\,,$$
where $\otimes$ denotes the Kronecker (or canonical tensor) product and $d(i,j)$ denotes the Hamming distance of the strings $i$ and $j$. To keep the description analytically tractable, we will focus on fitness-proportionate selection,
$$\mathbf{S}_t\,\vec{x} = \frac{\mathbf{W}_t\,\vec{x}}{\langle f \rangle_t} \qquad \text{with} \qquad \mathbf{W}_t = \mathrm{diag}\big(f_0(t), \ldots, f_{n-1}(t)\big) \quad \text{and} \quad \langle f \rangle_t = \sum_{i=0}^{n-1} f_i(t)\,x_i\,.$$
This will already provide us with some insight into the general behavior of a GA in time-dependent fitness landscapes.
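The two equivalent constructions of the mutation operator, and one generation of mutation followed by fitness-proportionate selection, can be sketched as follows (a minimal NumPy illustration with invented parameter values, not code from the chapter):

```python
import numpy as np

ell, mu = 3, 0.05          # illustrative string length and per-bit mutation rate
n = 2 ** ell

# Kronecker construction: M = [[1-mu, mu], [mu, 1-mu]] tensored ell times.
m1 = np.array([[1 - mu, mu], [mu, 1 - mu]])
M = m1
for _ in range(ell - 1):
    M = np.kron(M, m1)

# Element-wise construction via Hamming distances d(i, j).
d = lambda i, j: bin(i ^ j).count("1")
M2 = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
               for i in range(n)])
assert np.allclose(M, M2)

# One generation: mutate, then select fitness-proportionately (Eq. 1).
f = np.arange(1.0, n + 1.0)        # some illustrative fitness values
x = np.full(n, 1.0 / n)            # uniform initial population
y = f * (M @ x)                    # unnormalized W M x
x_next = y / y.sum()               # division by the mean fitness renormalizes
```

Both constructions agree, the columns of $\mathbf{M}$ sum to one (mutation conserves probability), and the generation step maps normalized populations to normalized populations.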
Since the GA corresponding to Eq. 1 applies mutation to the current population and selects the new population with complete replacement of the current one, it is called a generational GA (genGA). In addition to genGAs, steady-state GAs (ssGAs) with a two-step reproduction process are also in common use: First, a small fraction of the current population is chosen to produce mutants according to some heuristics. Second, another fraction of the current population is chosen to be replaced by those mutants according to some other heuristics (see [14, 15, 16] and references therein). We can include ssGAs into our description in an approximate fashion by simply bypassing a fraction $1-\gamma$ of the population into the selection process without mutation, whereas the remaining fraction $\gamma$ gets mutated before it enters the selection process. The generation operator then reads
$$\vec{x}(t+1) = \mathbf{S}_t\big[(1-\gamma)\,\mathbf{1} + \gamma\,\mathbf{M}\big]\,\vec{x}(t)\,. \qquad (2)$$
By varying $\gamma$ within the interval $(0, 1]$, we can interpolate between steady-state behavior (ssGA) for $\gamma \ll 1$ and generational behavior (genGA) for $\gamma = 1$. Equation 2 is only an approximation of the true generation operator for ssGAs, because the heuristics involved in the choice of the mutated and replaced members are neglected. But in the next section, the heuristics are expected to play a minor role for our general conclusion on an inertia of ssGAs against time variations.

At this point, we want to review briefly the correspondence of our GA model with the quasispecies model, extensively studied by Eigen and coworkers [6, 7, 8] in the context of molecular evolution theory (see also [13] in this book). The quasispecies model describes a system of self-replicating entities (e. g. RNA or DNA strands) with replication rates $f_i$ and an imperfect copying procedure such that mutations occur. For simplicity, the overall concentration of molecules in the system is held constant by an excess flow $\Phi(t)$. In the above notation, the continuous model reads
$$\frac{d}{dt}\,\vec{x}(t) = \big[\mathbf{M}\,\mathbf{W}_t - \Phi(t)\,\mathbf{1}\big]\,\vec{x}(t)\,, \qquad (3)$$
where the flux $\Phi(t)$ needs to equal the average replication rate, $\Phi(t) = \langle f \rangle_t = \sum_i f_i(t)\,x_i(t)$, in order to keep the concentration vector $\vec{x}$ normalized. This model might then be discretized via $\frac{d}{dt}\vec{x}(t) \approx \big[\vec{x}(t+\Delta t) - \vec{x}(t)\big]/\Delta t$, which unveils the similarity to a ssGA:
$$\vec{x}(t+\Delta t) = \big(1 - \Delta t\,\langle f \rangle_t\big)\,\vec{x}(t) + \Delta t\,\mathbf{M}\,\mathbf{W}_t\,\vec{x}(t)\,. \qquad (4)$$
By comparison with Eq. 2, we can easily read off that $\gamma = \Delta t\,\langle f \rangle_t$. This means a low (resp. high) average fitness leads to a small (resp. large) replacement – a property that is not wanted in the context of the optimization problems for which GAs are usually used, because one does not want to remain in a region of low fitness for a long time. Another difference from ssGAs is the fact that in the continuous Eigen model, selection acts only on the mutated fraction of the population – although this leads only to subtle differences in the dynamics of ssGAs and the Eigen model.
Equation 3 is commonly referred to as the 'continuous Eigen model' in the literature, because of the continuous time, and Eq. 4 is simply its discretized form, which can be used for numerical calculations. Nonetheless, the notion 'discrete Eigen model' is seldom used for Eq. 4; instead, it is often used in the literature for the genGA,
$$\vec{x}(t+1) = \mathbf{S}_t\,\mathbf{M}\,\vec{x}(t) = \frac{\mathbf{W}_t\,\mathbf{M}\,\vec{x}(t)}{\sum_j \big[\mathbf{W}_t\,\mathbf{M}\,\vec{x}(t)\big]_j}\,. \qquad (5)$$
This stems from the identical asymptotic behavior of Eqs. 4 and 5 for static fitness landscapes. However, there are differences for time-dependent fitness landscapes, as we will see in the following two sections.
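One way to see the close asymptotic relation between the two models on a static landscape is spectral: the operators $\mathbf{W}\mathbf{M}$ (genGA) and $\mathbf{M}\mathbf{W}$ (discretized Eigen model) share their eigenvalues, and their Perron vectors differ only by one application of $\mathbf{W}$. A hedged numerical sketch (parameters and the needle landscape are illustrative, not taken from the chapter):

```python
import numpy as np

ell, mu = 3, 0.05
n = 2 ** ell
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])
W = np.diag(np.concatenate(([10.0], np.ones(n - 1))))  # needle-in-the-haystack

WM, MW = W @ M, M @ W

# Same spectrum for both orderings.
ev_wm = np.sort(np.linalg.eigvals(WM).real)
ev_mw = np.sort(np.linalg.eigvals(MW).real)
assert np.allclose(ev_wm, ev_mw)

def perron(A):
    """Normalized eigenvector to the largest eigenvalue."""
    lam, V = np.linalg.eig(A)
    v = np.abs(V[:, np.argmax(lam.real)].real)
    return v / v.sum()

# Perron vectors are related by one application of W.
v_mw = perron(MW)
v_wm = perron(WM)
w_v = W @ v_mw
assert np.allclose(v_wm, w_v / w_v.sum())
```

So the two asymptotic states carry the same information on a static landscape; the differences appear only once the landscape moves.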
2 Regular Changes and Generalized Quasispecies
In the case of a static landscape, the fixed points of the generation operator, which are in fact stationary states of the evolving system (if contained within the state space, see [10]), can be found by solving an eigenvalue problem, because of
$$\vec{x} = \mathbf{G}\,\vec{x} = \frac{\mathbf{W}\,\mathbf{M}\,\vec{x}}{\sum_j\big[\mathbf{W}\,\mathbf{M}\,\vec{x}\big]_j} \quad\Longleftrightarrow\quad \mathbf{W}\,\mathbf{M}\,\vec{x} = \lambda\,\vec{x} \;\;\text{with}\;\; \lambda = \sum_j\big[\mathbf{W}\,\mathbf{M}\,\vec{x}\big]_j\,. \qquad (6)$$
Let $\lambda_k$ and $\vec{v}_k$ denote the eigenvalues and eigenvectors of $\mathbf{W}\mathbf{M}$, in descending order $|\lambda_0| \geq |\lambda_1| \geq \ldots$ and with normalization $\|\vec{v}_k\|_1 = 1$. For $0 < \mu < 1/2$, the Perron–Frobenius theorem assures the non-degeneracy of the eigenvector $\vec{v}_0$ to the largest eigenvalue $\lambda_0 > 0$, and moreover it assures $\vec{v}_0 > 0$ componentwise. Often, $\vec{v}_0$ is called the Perron vector. After a transformation to the basis of the eigenvectors, it can be straightforwardly shown that $\vec{x}(t)$ converges to $\vec{v}_0$ for $t \to \infty$. The population represented by $\vec{v}_0$ was called the 'quasispecies' by Eigen, because this population does not consist of only a single dominant genotype, or string, but of a particular stable mixture of different genotypes.

Let us now consider time-dependent landscapes. If the time dependency is introduced simply by a single scalar factor, like
$$\mathbf{W}_t = c(t)\,\mathbf{W}_0\,, \qquad c(t) > 0\,,$$
it immediately drops out of the selection operator for GAs. For the continuous Eigen model, we note that the eigenvectors of $\mathbf{M}\mathbf{W}_t$ and $\mathbf{M}\mathbf{W}_0$ are the same and that $\lambda_k(t) = c(t)\,\lambda_k(0)$. Since $c(t) > 0$, which is necessary to keep the fitness values positive, the order of the eigenvalues remains, such that $\mathbf{W}_t$ will show the same quasispecies as $\mathbf{W}_0$. Contrasting with that special case, a general, individual time dependency of the strings' fitnesses does indeed change the eigenvalues and eigenvectors of $\mathbf{W}_t\,\mathbf{M}$ compared to $\mathbf{W}_0\,\mathbf{M}$. For an arbitrary time dependency, the Perron vector is constantly changing, and therefore we cannot even define a unique asymptotic state. However, this problem disappears for what we call regular changes. After having established a theory for such changes, we can then take into account more and more non-regular ingredients.

What do we mean by "regular change"? We define it heuristically in the following way: a regular change is a change that happens with fixed duration $T$ and obeys some deterministic rule that is the same for all change cycles. Let us express the latter more formally and make it more clear what we mean by "same rule of change". Within a change cycle, we allow for an arbitrary time dependency of the fitness, up to the restriction that two different change cycles must be connected by a permutation of the sequence space. Thus, if the time dependency is chosen for one change cycle, e. g. the first change cycle starting at $t = 0$, it is already fixed for all other cycles, apart from the permutations. We will represent permutations $\pi$ from the permutation group of the sequence space as matrices
$$\big(\mathbf{P}_{\pi}\big)_{ij} = \delta_{i,\pi(j)}\,.$$
The permutations of vectors and matrices are obtained by
$$\vec{x}\,' = \mathbf{P}\,\vec{x}\,, \qquad \mathbf{A}' = \mathbf{P}\,\mathbf{A}\,\mathbf{P}^{T}\,,$$
where $\mathbf{P}^{T}$ denotes the transpose of $\mathbf{P}$ with the property $\mathbf{P}^{T} = \mathbf{P}^{-1}$.
In reference to the first change cycle, we define the fitness landscape as being single-time-dependent if and only if for each change cycle $k$ there exists a permutation $\mathbf{P}_k$, such that for all cycle phases $\tau \in \{0, \ldots, T-1\}$
$$\mathbf{W}_{kT+\tau} = \mathbf{P}_k\,\mathbf{W}_{\tau}\,\mathbf{P}_k^{T}\,.$$
We will call each permutation $\mathbf{P}_k$ a jump rule, or simply rule, which connects $\mathbf{W}_{kT+\tau}$ and $\mathbf{W}_{\tau}$. To make predictions about the asymptotic state of the system, we need to relate the generation operators of different change cycles to each other. This is readily achieved if the permutations commute with the mutation operator $\mathbf{M}$. The condition for this being the case is that for all strings $i, j$,
$$d\big(\pi_k(i), \pi_k(j)\big) = d(i, j)\,.$$
Thus, the Hamming distances need to be invariant under the permutations $\pi_k$. Geometrically this means that the fitness landscape gets "translated" or "rotated" by those permutations without changing the neighborhood relations. Then, we find for arbitrary $k$ and $\tau$,
$$\mathbf{G}_{kT+\tau} = \mathbf{P}_k\,\mathbf{G}_{\tau}\,\mathbf{P}_k^{T}\,. \qquad (7)$$
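The commutation condition can be checked numerically for a simple Hamming-distance-preserving permutation. The sketch below (illustrative values, not from the chapter) uses an exclusive-or with an arbitrary mask, which satisfies $d(i \oplus m, j \oplus m) = d(i,j)$ and therefore commutes with $\mathbf{M}$:

```python
import numpy as np

ell, mu, mask = 4, 0.1, 0b0110     # hypothetical parameters and xor mask
n = 2 ** ell
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])

# Permutation matrix of the map i -> i xor mask.
P = np.zeros((n, n))
for i in range(n):
    P[i ^ mask, i] = 1.0

# Hamming distances are invariant under the permutation, hence P M = M P.
assert all(d(i ^ mask, j ^ mask) == d(i, j) for i in range(n) for j in range(n))
assert np.allclose(P @ M, M @ P)
```

The same check fails for a generic permutation that scrambles neighborhoods, which is exactly why the Hamming-distance invariance is required.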
To study the asymptotic behavior of the system, it is useful to accumulate the time dependency of a change cycle by introducing the cycle generation operators,
$$\mathbf{G}^{(k)} = \mathbf{G}_{(k+1)T-1}\cdots\mathbf{G}_{kT+1}\,\mathbf{G}_{kT}\,.$$
Because of Eq. 7, all these operators are related to $\mathbf{G}^{(0)}$ by
$$\mathbf{G}^{(k)} = \mathbf{P}_k\,\mathbf{G}^{(0)}\,\mathbf{P}_k^{T}\,.$$
This property allows us to write the time evolution of the system in the form
$$\vec{x}(kT+\tau) = \mathbf{G}_{kT+\tau-1}\cdots\mathbf{G}_{kT}\;\mathbf{G}^{(k-1)}\cdots\mathbf{G}^{(0)}\,\vec{x}(0)\,, \qquad (8)$$
where $\tau$ denotes in the following always the phase within a cycle.
Let us consider the special case of a single rule $\mathbf{P}$ being applied at the end of each change cycle, which results in $\mathbf{P}_k = \mathbf{P}^{k}$; e. g. imagine a fitness peak that moves at a constant "velocity" through the string space. We will see below that for those cases it is possible to identify the asymptotic state with a quasispecies in analogy to static fitness landscapes. Because of that, we can now define the notion of regularity of a fitness landscape formally in the following manner:

A time-dependent fitness landscape is regular if and only if: (i) the fitness landscape is single-time-dependent, (ii) there exists some rule $\mathbf{P}$ which is applied at the end of each cycle such that $\mathbf{P}_k = \mathbf{P}^{k}$, and (iii) the rule $\mathbf{P}$ commutes with the mutation operator $\mathbf{M}$.

In this case, we get with $\mathbf{P}^{T} = \mathbf{P}^{-1}$ the time evolution
$$\vec{x}(kT) = \mathbf{P}^{k}\,\big(\mathbf{P}^{T}\,\mathbf{G}^{(0)}\big)^{k}\,\vec{x}(0)\,. \qquad (9)$$
To proceed, it is useful to permute the concentrations compatibly with the rule of the fitness landscape. By this, concentrations are measured in reference to the fitness landscape structure of the start cycle $k = 0$. We will denote those concentrations by $\vec{x}\,'$ and they are related to the concentrations $\vec{x}$ by
$$\vec{x}\,'(kT+\tau) = \big(\mathbf{P}^{T}\big)^{k}\,\vec{x}(kT+\tau)\,, \qquad \vec{x}\,'(kT) = \big(\mathbf{P}^{T}\,\mathbf{G}^{(0)}\big)^{k}\,\vec{x}(0)\,. \qquad (10)$$
For example, if there is no time dependency within the cycles, some component $x'_i$ will for all cycles measure the concentration of the highest-fitness string, independent of its current position in string space. Thus, $\vec{x}\,'$ evolves in a fitness landscape with periodic change, which can also be seen from the second relation of Eq. 10. In analogy to the static case Eq. 6, the calculation of fixed points of $\mathbf{P}^{T}\,\mathbf{G}^{(0)}$ is equivalent to an eigenvalue problem,
$$\mathbf{P}^{T}\,\tilde{\mathbf{G}}^{(0)}\,\vec{v} = \lambda\,\vec{v}\,,$$
where $\tilde{\mathbf{G}}^{(0)}$ is the unnormalized generation operator obtained from the accumulation of the unnormalized generation operators $\tilde{\mathbf{G}}_{\tau} = \mathbf{W}_{\tau}\,\mathbf{M}$.
The corresponding periodic quasispecies can be calculated for all phases $\tau$ of the change cycle from the Perron vector $\vec{v}_0$ of $\mathbf{P}^{T}\,\tilde{\mathbf{G}}^{(0)}$ in the following way,
$$\vec{x}\,'^{\,*}(\tau) = \frac{\tilde{\mathbf{G}}_{\tau-1}\cdots\tilde{\mathbf{G}}_{0}\,\vec{v}_0}{\big\|\tilde{\mathbf{G}}_{\tau-1}\cdots\tilde{\mathbf{G}}_{0}\,\vec{v}_0\big\|_{1}}\,. \qquad (11)$$
To find the asymptotic states of the concentrations $\vec{x}$, we simply need to invert Eq. 10,
$$\vec{x}^{\,*}(kT+\tau) = \mathbf{P}^{\,k \bmod \nu}\;\vec{x}\,'^{\,*}(\tau)\,, \qquad (12)$$
where $\nu$ is the order of the group element $\mathbf{P}$, i. e. the smallest positive integer with $\mathbf{P}^{\nu} = \mathbf{1}$.
The essential reason for the existence of asymptotic states for $\vec{x}$ lies in the finiteness of the permutation group of the sequence space. Because of $\mathbf{P}^{\nu} = \mathbf{1}$, we find directly from Eq. 9 the asymptotic state
$$\vec{x}^{\,*}(k\nu T) = \vec{v}_0\,,$$
where $\vec{v}_0$ is the same as in Eq. 11, because $\big(\mathbf{P}^{T}\tilde{\mathbf{G}}^{(0)}\big)^{\nu}$ and $\mathbf{P}^{T}\tilde{\mathbf{G}}^{(0)}$ have the same eigenvectors, in particular the same Perron vector. Moreover, we get
$$\vec{x}^{\,*}\big((k+\nu)T+\tau\big) = \vec{x}^{\,*}(kT+\tau)\,, \qquad (13)$$
which is the same result as Eqs. 11 and 12 yield. In the limit of long strings $\ell \to \infty$, $\nu$ is not necessarily finite anymore. If $\nu \to \infty$, then the asymptotic states Eq. 13 for $\vec{x}$ do not exist, but Eq. 11 still holds. Hence, a quasispecies exists even in the limit $\ell \to \infty$ if measured in reference to the structure of the fitness landscape.
In conclusion, Eqs. 11 and 13 represent the generalized quasispecies for the class of regular fitness landscapes, which includes static and periodic fitness landscapes as special cases. In fact, the simplest case of a regular change is a periodic variation of the fitness values, because no permutations are involved ($\mathbf{P} = \mathbf{1}$) and hence $\mathbf{P}_k = \mathbf{1}$ for all $k$. The quasispecies was generalized for this case already in [17] and – using a slightly different formalism – in [4]. In Section 4, we will study a more complicated example.
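The whole construction can be exercised numerically. The sketch below (all parameters illustrative; the jump rule is a shift-plus-xor map of the kind studied in Section 4) builds a needle landscape whose peak is moved by a Hamming-distance-preserving rule at the end of every cycle, and verifies that the permuted cycle dynamics converges to the Perron vector of $\mathbf{P}^{T}\,\tilde{\mathbf{G}}^{(0)}$:

```python
import numpy as np

ell, mu, T = 4, 0.05, 5
n = 2 ** ell
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])

shift = lambda i: ((i << 1) | (i >> (ell - 1))) & (n - 1)   # cyclic left shift
rule = lambda i: shift(i ^ 1)                               # xor 0...01, then shift
P = np.zeros((n, n))
for i in range(n):
    P[rule(i), i] = 1.0                                     # (P x)_{rule(i)} = x_i
assert np.allclose(P @ P.T, np.eye(n))                      # P^T = P^{-1}

# Accumulated unnormalized cycle operator for a needle at string 0.
W0 = np.diag(np.where(np.arange(n) == 0, 10.0, 1.0))
G0 = np.linalg.matrix_power(W0 @ M, T)

# Perron vector of P^T G^(0).
lam, V = np.linalg.eig(P.T @ G0)
v = np.abs(V[:, np.argmax(lam.real)].real)
v /= v.sum()

# Iterating the permuted cycle dynamics converges to that Perron vector.
x = np.full(n, 1.0 / n)
for _ in range(300):
    x = P.T @ (G0 @ x)
    x /= x.sum()
assert np.allclose(x, v, atol=1e-6)
```

Measured in the co-moving frame, the population thus settles into a unique generalized quasispecies although the landscape itself never stops moving.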
3 Schematic Phase Diagram
To get an intuitive feeling for the typical behavior of ssGAs and genGAs, let us consider some special lines in the plane spanned by the mutation rate $\mu$ and the time scale of changes $T$, as shown in Fig. 1. The mutation operator represents a copying procedure with occasional errors only for $\mu < 1/2$, whereas for $\mu > 1/2$ it systematically tends to invert strings, i. e. it resembles an inverter with occasional errors. Since mutation should introduce weak modifications to the strings, we will consider only $\mu \leq 1/2$.
Disorder line:

For $\mu = 1/2$, every matrix element of $\mathbf{M}$ equals $2^{-\ell}$, and the Perron vector of $\mathbf{M}\mathbf{W}$ is always the uniform distribution $\frac{1}{n}(1, \ldots, 1)^{T}$. The population will therefore converge towards the disordered state. Because of the continuity of the Perron vector in $\mu$, we already enter a disordered phase for $\mu$ close to $1/2$.
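The disorder line is easy to verify directly: at $\mu = 1/2$ each factor $\mu^{d}(1-\mu)^{\ell-d}$ collapses to $2^{-\ell}$ regardless of the Hamming distance, so mutation erases all structure in a single step (a small check, not from the chapter):

```python
import numpy as np

ell = 4
n = 2 ** ell
mu = 0.5
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])

# Every matrix element equals 2^(-ell): one mutation step maps any
# population onto the uniform distribution over all strings.
assert np.allclose(M, 1.0 / n)
```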
Time-average region:

For $\mu = 0$, the mutation operator is the identity. We find as time evolution simply the product average over the fitness of the evolved time steps:
$$\vec{x}(t) = \frac{\mathbf{W}_{t-1}\cdots\mathbf{W}_{0}\,\vec{x}(0)}{\big\|\mathbf{W}_{t-1}\cdots\mathbf{W}_{0}\,\vec{x}(0)\big\|_{1}}\,.$$
Since diagonal operators commute, the order in which the $\mathbf{W}_{\tau}$ get multiplied does not make any difference. For the case of a periodic landscape, the product over one period is independent of the phase. The quasispecies is then a linear superposition of the eigenvectors to the largest eigenvalue of the product-averaged fitness landscape – there might be more than one such eigenvector, since the product-averaged operator is diagonal and the Perron–Frobenius theorem does not apply. Because of the continuity of the dynamics in $\mu$, the dynamics are governed already for $0 < \mu \ll 1$ by the product average. Analogous conclusions apply to those non-periodic landscapes for which, by choosing a suitable time scale, a meaningful average can be defined.
For ssGAs, the mutated fraction $\gamma$ is small, and we find to first order in $\gamma$ a time evolution which is, apart from the position of the mutation operator within the terms, an arithmetic time average over the fitness landscape. If the changes happen on a time scale $T \ll 1/\gamma$, the time evolution is governed by this time average, and we find time-averaged behavior. Thus, the width of the time-average region is proportional to $1/\gamma$. A detailed analysis of the effect of the different positions of the mutation operator within the terms, which are otherwise an arithmetic time average, has not yet been carried out.
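The commutation argument behind the product average can be made concrete (a minimal sketch with invented fitness values, not from the chapter): with $\mu = 0$ the evolution over one period is a product of diagonal operators, so only the product-averaged landscape matters, independent of the order of the phases.

```python
import numpy as np

n = 8
rng = np.random.default_rng(1)
w1 = rng.uniform(0.5, 2.0, n)       # fitness values of phase 1 (illustrative)
w2 = rng.uniform(0.5, 2.0, n)       # fitness values of phase 2 (illustrative)

# Diagonal operators commute.
assert np.allclose(np.diag(w1) @ np.diag(w2), np.diag(w2) @ np.diag(w1))

x = np.full(n, 1.0 / n)

# One full period of the mu = 0 dynamics, phase 1 then phase 2 ...
z = w2 * (w1 * x)
z /= z.sum()

# ... equals a single step with the product-averaged landscape.
y = (w1 * w2) * x
y /= y.sum()
assert np.allclose(y, z)
```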
Quasistatic region:

If the changes happen on a time scale $T$ very large compared to the average relaxation time of the population, the quasispecies grows nearly without noticing the changes. Thus, in the quasistatic region all quasispecies that might be expected from the static landscapes will occur at some time during one cycle.
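The quasistatic statement can be checked numerically: if a needle landscape is held fixed for many generations, the population at the end of the holding time is close to the static quasispecies, i. e. the Perron vector of $\mathbf{W}\mathbf{M}$ (illustrative parameters, not from the chapter):

```python
import numpy as np

ell, mu, T = 5, 0.05, 300           # T much larger than the relaxation time
n = 2 ** ell
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])
W = np.where(np.arange(n) == 0, 10.0, 1.0)   # needle at string 0

# Static quasispecies: Perron vector of W M.
lam, V = np.linalg.eig(np.diag(W) @ M)
v = np.abs(V[:, np.argmax(lam.real)].real)
v /= v.sum()

# Evolve the genGA for T generations in the static landscape.
x = np.full(n, 1.0 / n)
for _ in range(T):
    x = W * (M @ x)
    x /= x.sum()
assert np.allclose(x, v, atol=1e-6)
```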
Wilke et al. present in [18] the schematic phase diagram of the continuous Eigen model, which exhibits the same time-average phases as that for ssGAs. Their result is in perfect agreement with two recently, explicitly studied time-dependent landscapes. First, Wilke et al. studied in [17] a needle-in-the-haystack (NiH) landscape, in which a single string (the needle) has fitness $f_0 > 1$ while all other strings have fitness $1$, with oscillating, periodic fitness of the needle.
The continuous model was iterated in its discretized form Eq. 4, and the periodic quasispecies Eq. 11 was calculated. Figure 2 (left) shows the resulting phase diagram. For small $T$, the error threshold is given by the one of the time-averaged landscape, whereas for large $T$, the error threshold oscillates between minimum and maximum values corresponding to the minimum and maximum needle fitness, as expected in the quasistatic regime. Second, Nilsson and Snoad studied in [19] a moving NiH that jumps randomly to one of its nearest-neighbor strings every $T$ time steps. The time average of this landscape over many jump cycles is a totally flat or neutral landscape, which explains the extension of the disordered phase to small $\mu$ and small $T$, as shown in Fig. 2 (right). In the quasistatic region, order is expected because the needle stays long enough at each position for a quasispecies to grow. Hence, we can understand the existence of the observed and calculated phase diagrams in Fig. 2 from simple arguments. In fact, they are special instances of the general schematic phase diagram depicted in Fig. 1.
In the following, we will consider regularly moving NiHs and derive the infinite-population behavior of a genGA in such landscapes. This is interesting, since genGAs can be expected to adapt faster to changes than ssGAs, as the missing time-average region of genGAs for small $T$ suggests. To clarify whether a phase diagram different from Fig. 2 (right) emerges for genGAs with a moving NiH, we will calculate the phase diagram, including the optimal mutation rate that maximizes a lower bound for the concentration of the needle string in the population.
4 Generational GA and a Moving NiH
In this section, we want to analyze quantitatively the asymptotic behavior of a genGA with a NiH that moves regularly, in the sense of Section 2, to one of its nearest neighbors every $T$ time steps. At the end, we will also be able to comment on the case of a NiH that jumps randomly to one of its nearest neighbors.
A simple example of a NiH that moves regularly to nearest neighbors is shown in Fig. 3 (left). Each jump corresponds to a rotation of the four-dimensional hypercube, i. e. the lower two bits are rotated as shown in Fig. 3 (right). We will call the set of strings which is obtained by applying the same rule over and over to some initial string the orbit of that string under the rule. The period length of the orbit shown in Fig. 3 (left) originates from the rotation angle and hence is independent of the string length $\ell$: the orbits of such rotations will always be restricted to only four different strings. For reasons that will become clear below, we are looking for regular movements of the needle that are not restricted to such a small subspace of the string space. Instead, the needle is supposed to move 'straight away' from previous positions in string space. Since a complete classification and analysis of all possible regular movements for given string length and jump distance is out of the scope of this work, we will simply give an example of a rule that generates such movements: the composition $\rho$ of a cyclic 1-bit left shift, which we denote by $\sigma$, and an exclusive-or with $0\cdots01$, which we denote by $\chi$.
For small string lengths, $\rho = \sigma \circ \chi$ corresponds to a rotation of the hypercube, as can be seen in Fig. 4. Moreover, the orbit of the all-zeros string under $\rho$ is shown in Fig. 5 for the same string length. For arbitrary string length $\ell$, it is more difficult to visualize the action of $\sigma$ and hence of $\rho$. But it is easily verified that, starting from all zeros $0\cdots0$, the all-ones string $1\cdots1$ will be reached after exactly $\ell$ jumps. Moreover, the orbit of $0\cdots0$ under $\rho$ has the period length $2\ell$. In the limit of long strings $\ell \to \infty$, this periodicity is broken, because the needle never (i. e. only after infinitely many jumps) returns to all zeros, but – as we have shown in Eq. 11 using Eq. 10 – there still exists an asymptotic quasispecies.
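The orbit properties of the rule are easy to verify computationally. The sketch below implements one possible reading of the composition (first the exclusive-or $\chi$ with $0\cdots01$, then the cyclic left shift $\sigma$) and checks, for several string lengths, that every jump goes to a nearest neighbor, that the all-ones string is reached after $\ell$ jumps, and that the orbit period is $2\ell$:

```python
# Orbit check for the rule rho = sigma o chi on strings encoded as integers.
for ell in (3, 4, 6):
    n = 2 ** ell
    shift = lambda i: ((i << 1) | (i >> (ell - 1))) & (n - 1)  # sigma
    rule = lambda i: shift(i ^ 1)                              # sigma o chi

    orbit = [0]
    while True:
        nxt = rule(orbit[-1])
        if nxt == 0:
            break
        orbit.append(nxt)

    # every jump moves the needle to a nearest neighbor (Hamming distance 1)
    assert all(bin(a ^ b).count("1") == 1
               for a, b in zip(orbit, orbit[1:] + [0]))
    assert orbit[ell] == n - 1        # all-ones string reached after ell jumps
    assert len(orbit) == 2 * ell      # orbit period length 2 * ell
```

For $\ell = 4$, for example, the orbit runs $0000 \to 0010 \to 0110 \to 1110 \to 1111 \to 1101 \to 1001 \to 0001 \to 0000$.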
How does our simple GA behave with a NiH that moves according to $\rho$? In Fig. 6, two typical runs of a genGA with such a NiH are depicted. The setting was kept fixed, but two different mutation rates were chosen. In the case of Fig. 6 (right), the mutation rate is 'too high' to allow the population to track the movement. The concentration of the future needle string (solid line) cannot grow much within one jump cycle, resulting in a decreasing initial condition (bullet) for the growth of the needle concentration (dotted line) in the next cycle. The population loses the peak after a number of generations. It might happen that the population finds the needle again by chance (or rather, that the moving needle jumps into the population), but the population will not be able to track the movement stably. Contrasting with that, the mutation rate in Fig. 6 (left) was chosen to maximize the concentration of the future needle string at the end of each jump cycle (bullets).
Since in that case the best achievable initial condition is given to each jump cycle, the movement of the needle is tracked with the highest possible stability for the given setting. As can be expected from Fig. 6 and is affirmed by further experiments, the bullets keep on fluctuating around an average value, which is for the infinite population given by the quasispecies Eq. 11. In the following, we are going to model that system with some idealizations, and we will calculate a lower bound for this average value.
We adopt the viewpoint of permuting the concentration vector compatibly with the movement of the needle, as we have done implicitly in Fig. 6 and formally in the definition of $\vec{x}\,'$ in Eq. 10, but we drop the primes henceforth. The concentration of the needle string within jump cycle $k$ is denoted by $x_0^{(k)}$, and the concentration of the string the needle will move to with the next jump (i. e. the future needle string) is denoted by $x_1^{(k)}$. The initial cycle, prior to which no jump has occurred, is $k = 0$. Within a cycle, the time or generation is counted as phase $\tau \in \{0, \ldots, T\}$. Two succeeding cycles are connected by the (approximated) rule of change
$$x_0^{(k+1)}(0) = x_1^{(k)}(T)\,, \qquad x_1^{(k+1)}(0) \approx 0\,. \qquad (14)$$
The second relation is an approximation made to simplify the coming calculations, but it holds only if the needle jumps onto a string which has not been close to one of the previous needle positions. Otherwise, the future needle string could already be present with a concentration significantly larger than zero. In Fig. 6, we have chosen the rule $\rho$ to get experimental data for a case in which this assumption is fulfilled. Later on, we will see that we can still make useful comments about cases in which that approximation is partly broken.
If we plot $x_1(T)$ against the initial needle concentration $x_0(0)$, we get an intuitive picture of the system's evolution towards the quasispecies. The concentration converges for $k \to \infty$ towards a fixed point,
$$y^{*} = \lim_{k \to \infty} x_1^{(k)}(T)\,,$$
as shown in Fig. 7 for a finite value of $T$. Obviously, this fixed point depends on the full setting $(\ell, f_0, T, \mu)$, where $f_0$ denotes the needle fitness. Since we are especially interested in the effects of various cycle lengths $T$ and mutation rates $\mu$, we keep $\ell$ and $f_0$ fixed, such that $y^{*} = y^{*}(T, \mu)$.
In the remainder of this section, we will calculate $x_1(T)$ in dependence on $x_0(0)$, which is the solid curve in Fig. 7, for arbitrary parameter settings. From this knowledge, we will construct the phase diagram. Since we stay within one jump cycle, we drop the cycle index $k$ to take off some notational load.
4.1 Derivation of the Fixed Point Concentrations
To calculate $x_1(T)$, it is sufficient to take only $x_0$ and $x_1$ into account, because the assumed initial condition is $x_1(0) = 0$, such that the main growth of $x_1$ is produced by the mutational flow from the needle. Moreover, we assume $\mu$ to be small enough that terms proportional to $\mu^2$ can be neglected. This means we restrict ourselves to the case in which the system is mainly driven by one-bit mutations. Without normalization, the evolution equations then read
$$\tilde{x}_0(\tau+1) = f_0\big[(1-\mu)^{\ell}\,\tilde{x}_0(\tau) + \mu\,(1-\mu)^{\ell-1}\,\tilde{x}_1(\tau)\big]\,, \qquad \tilde{x}_1(\tau+1) = (1-\mu)^{\ell}\,\tilde{x}_1(\tau) + \mu\,(1-\mu)^{\ell-1}\,\tilde{x}_0(\tau)\,, \qquad (15)$$
where $\tilde{x}_i$ denote unnormalized concentrations, in contrast to the normalized concentrations $x_i$.
For $\mu\,\tilde{x}_1 \ll (1-\mu)\,\tilde{x}_0$, which is always the case for large enough $f_0$, we can further neglect the backflow from the future needle string compared to the self-replication of the current needle string. The solution of Eq. 15 is then given by
$$\tilde{x}_0(\tau) = \big[f_0\,(1-\mu)^{\ell}\big]^{\tau}\,\tilde{x}_0(0)\,, \qquad \tilde{x}_1(\tau) = \mu\,(1-\mu)^{\ell\tau-1}\,\frac{f_0^{\tau}-1}{f_0-1}\,\tilde{x}_0(0)\,.$$
The coefficient in front of $\tilde{x}_0(0)$ in the second relation measures the growth of $\tilde{x}_1$ starting from the initial condition $\tilde{x}_1(0) = 0$. As long as the total growth is small, this already gives a good approximation for the concentrations $x_0$ and $x_1$. But in general, this approximation breaks down for large $\tau$, because of the exponential growth of $\tilde{x}_0$. We need to normalize our solution, which can be done by
$$x_i(\tau) = \frac{\tilde{x}_i(\tau)}{\tilde{x}_{\mathrm{tot}}(\tau)}\,, \qquad \tilde{x}_{\mathrm{tot}}(\tau) = \sum_{j}\tilde{x}_j(\tau)\,. \qquad (16)$$
By expressing the fitness averages in terms of $\tilde{x}_0$, we find, after solving a simple recursion,
$$\tilde{x}_{\mathrm{tot}}(\tau) = 1 + (f_0-1)\,(1-\mu)^{\ell}\,\frac{\big[f_0\,(1-\mu)^{\ell}\big]^{\tau}-1}{f_0\,(1-\mu)^{\ell}-1}\,\tilde{x}_0(0)\,.$$
Finally, we arrive at the normalized concentrations
$$x_0(\tau) = \frac{\tilde{x}_0(\tau)}{\tilde{x}_{\mathrm{tot}}(\tau)}\,, \qquad x_1(\tau) = \frac{\tilde{x}_1(\tau)}{\tilde{x}_{\mathrm{tot}}(\tau)}\,.$$
The asymptotic state can now be calculated by using the initial condition $\tilde{x}_0(0) = x_0(0) = y$ and demanding $x_1(T) = y$. It is easily verified that for the nontrivial fixed point $y^{*}$ follows
$$y^{*} = \left[\mu\,(1-\mu)^{\ell T-1}\,\frac{f_0^{T}-1}{f_0-1} - 1\right]\frac{f_0\,(1-\mu)^{\ell}-1}{(f_0-1)\,(1-\mu)^{\ell}\,\Big[\big(f_0\,(1-\mu)^{\ell}\big)^{T}-1\Big]}\,. \qquad (17)$$
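The fixed-point behavior can be reproduced with the full infinite-population dynamics instead of the two-concentration approximation. The sketch below (illustrative parameters, not the chapter's figure settings) moves the needle by the rule $\rho$ at the end of every cycle and records the end-of-cycle concentration of the future needle string; the values settle to a fixed value of the kind bounded below by Eq. 17:

```python
import numpy as np

ell, mu, T, f0 = 6, 0.03, 10, 10.0     # hypothetical setting
n = 2 ** ell
d = lambda i, j: bin(i ^ j).count("1")
M = np.array([[mu ** d(i, j) * (1 - mu) ** (ell - d(i, j)) for j in range(n)]
              for i in range(n)])
shift = lambda i: ((i << 1) | (i >> (ell - 1))) & (n - 1)
rule = lambda i: shift(i ^ 1)          # the jump rule rho

needle = 0
x = np.full(n, 1.0 / n)
ends = []                              # end-of-cycle future-needle concentrations
for cycle in range(60):
    W = np.where(np.arange(n) == needle, f0, 1.0)
    for _ in range(T):                 # genGA generations within one cycle
        x = W * (M @ x)
        x /= x.sum()
    ends.append(x[rule(needle)])
    needle = rule(needle)              # the jump at the end of the cycle

# successive end-of-cycle values converge towards a fixed point,
# and the movement is tracked with a clearly nonzero concentration
assert abs(ends[-1] - ends[-2]) < 1e-6
assert ends[-1] > 5e-4
```

Since the full dynamics also contains flows from all other strings onto the future needle string, the measured fixed value lies above the two-concentration approximation, in line with the lower-bound character of Eq. 17.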
4.2 Consistency in the Quasistatic Limit
How can we test the quality of the approximate result Eq. 17? For large cycle lengths $T$, we enter the quasistatic regime, where we can approximate the population at the end of each cycle by the quasispecies of the corresponding static landscape. Figure 8 shows a comparison of the exact numerical calculation of the quasispecies with the result of Eq. 17. In the numerical calculation, the backflow from the first error class to the needle string is included. Overall, we find the error threshold and the maximum of the fixed point concentration well represented. This also suggests that the deviation of the approximation from the exact values should be small for smaller $T$, because those deviations would otherwise add up under the iterative procedure.
How do the calculated fixed point concentrations compare to simulations with a (large) finite population? In Fig. 6, the calculated fixed point values are shown for comparison. For the needle concentration, the deviation from the measured average is in fact the same as what can be read off in Fig. 8. The deviation of $y^{*}$ from the measured average value is significantly larger. This is caused by the neglect of all other strings' contributions, apart from the current needle string's contribution, to the flow onto the future needle string. These neglected contributions increase the average fixed point concentration measured in the experiment in comparison to the calculated value $y^{*}$. But even though there are deviations, we conclude that the approximately calculated value is always a lower bound for the exact value. In the next section, we will use this observation to derive an expression for the mutation rate that maximizes the average fixed point concentration.
4.3 Phase Diagram
In Fig. 9, the fixed point values $y^{*}$ are shown for small cycle lengths $T$. For the shown parameter setting, the region with $y^{*} > 0$ is extremely small. We notice that there are two error thresholds, one for 'too low' mutation rates, $\mu^{-}$, and one for 'too high' mutation rates, $\mu^{+}$. The intuition behind this was already given in Section 3: for too low mutation rates the population becomes too slow and effectively evolves in the time-averaged, flat landscape, whereas for too high mutation rates the usual transition to the disordered phase takes place. In the following, we will calculate the phase diagram starting from Eq. 17.
Error Thresholds:
The error thresholds are given by the condition $y^{*} = 0$, i. e.
$$\mu\,(1-\mu)^{\ell T-1}\,\frac{f_0^{T}-1}{f_0-1} = 1\,. \qquad (18)$$
This is the same condition as one would get using only the unnormalized concentrations $\tilde{x}_i$. Since the needle concentration nearly vanishes near the error thresholds, the neglect of the normalization is not critical for the calculation of the error thresholds themselves, whereas it is important for the optimal mutation rate and, of course, for the fixed point concentration. Since Eq. 18 cannot be solved for $\mu$ in closed form, we write down the corresponding recursion relations that converge, for suitable starting values of $\mu$, to the solutions of Eq. 18 in the limit $r \to \infty$,
$$\mu^{-}_{r+1} = \frac{f_0-1}{(f_0^{T}-1)\,(1-\mu^{-}_r)^{\ell T-1}}\,, \qquad \mu^{+}_{r+1} = 1 - \left[\frac{f_0-1}{\mu^{+}_r\,(f_0^{T}-1)}\right]^{\frac{1}{\ell T-1}}\,.$$
For $\mu^{-}$, a good starting value is $\mu^{-}_0 = 0$, since $\mu^{-} \ll 1$ anyway. For $\mu^{+}$, the approximate value for the error threshold of the static (i. e. non-moving) landscape can be chosen, which is obtained by calculating the fixed point $x_0^{*} = \big[f_0(1-\mu)^{\ell}-1\big]\big/\big[(f_0-1)(1-\mu)^{\ell}\big]$ [using Eqs. 15 and 16], setting it to zero and solving for $\mu$, which yields $\mu^{+}_0 = 1 - f_0^{-1/\ell}$.
Optimal Mutation Rate:
In order to track changes with the best achievable stability for a given setting, the lowest concentration of the needle string occurring during a cycle (the infimum of $x_0$) needs to be maximized, because a low concentration might result in the loss of the needle string in a finite population. Since for infinite populations $x_1(\tau)$ is monotonically increasing with $\tau$, it is sufficient to maximize $x_1(T)$. Moreover, we derived above that $x_1(T)$ approaches the fixed point value $y^{*}$ for $k \to \infty$. For finite populations, we expect similar behavior, but the strict monotony of $x_1$ in $\tau$ will be destroyed by fluctuations, and also the fixed point value itself will fluctuate around some average value, as can be seen in Fig. 6. However, the safest way to avoid any loss of the needle string is still to maximize the average fixed point value. In this sense, we define the optimal mutation rate $\mu_{\mathrm{opt}}$ as the one that maximizes the average fixed point value. In the previous Section 4.2, we noted that our approximated infinite-population value $y^{*}$ represents a lower bound for the measured average, where the maxima of the two curves are expected to coincide for fixed $\ell$, $f_0$ and $T$. Thus, $\mu_{\mathrm{opt}}$ can be obtained by maximization of $y^{*}$.
We can derive an expression for the optimal mutation rate from
$$\frac{\partial y^{*}}{\partial \mu}\,\bigg|_{\mu_{\mathrm{opt}}} = 0\,.$$
If we neglect the $\mu$ dependence of the normalization in Eq. 17, which corresponds to the approach in [19], we simply find $\mu_{\mathrm{opt}} = 1/(\ell T)$. Because of $\lim_{T \to \infty} 1/(\ell T) = 0$, this result is inconsistent with the quasistatic limit, because $\mu_{\mathrm{opt}}$ should approach the value for which the concentration of 1-mutants in the quasispecies of the corresponding static NiH landscape is maximized. We conclude that the $\mu$ dependence of the normalization cannot be neglected for the correct optimal mutation rate, which we are now going to calculate.
For $f_0^{T} \gg 1$, which is the case for large $f_0$ and moderate $T$, or moderate $f_0$ and large $T$, we can neglect the subtracted $1$ in the numerator of Eq. 17 and take only the leading terms into account for the calculation of $\mu_{\mathrm{opt}}$. After some algebra, we find a transcendental equation that cannot be solved in closed form for $\mu_{\mathrm{opt}}$. However, for large $T$ the equation simplifies, and by approximating the remaining transcendental terms, we get a cubic equation. The real root of that equation is approximately [20] given by (see also Fig. 10)
(19) 
Resulting Phase Diagram:
From the above, we are able to plot the phase diagram for our model, as shown in Fig. 11, for two parameter settings. The diamonds (resp. circles) are the numerically obtained error thresholds, and the solid and dash-dotted lines are the converged recursion values for $\mu^{+}$ and $\mu^{-}$. To show the convergence property of the recursions, early iterates are plotted as dashed lines. Obviously, the needed corrections to the chosen starting values increase for smaller $T$, such that more iterations are needed to describe the error thresholds correctly for small $T$. The converged expressions are already a good approximation for the given settings. Representing the quasistatic limit, the static error threshold is plotted as a dotted line and is consistently approached by $\mu^{+}$ for $T \to \infty$. Furthermore, the optimal mutation rate $\mu_{\mathrm{opt}}$ is plotted as a dash-dot-dotted line. The numerically measured values for the optimal mutation rate are shown as triangles (resp. squares); they approach the calculated curve very quickly with increasing $T$.
We conclude that the above quantitative description is in good agreement with the numerical observations and approaches the quasistatic region in a consistent way. Moreover, the phase diagram fits well into the general schematic one of Section 3. Even in the considered case of a genGA, we find – depending on the parameter setting – a time-averaged phase for very small $T$. The time-averaged phase broadens for small needle fitness $f_0$.
4.4 Stochastically moving NiH
Up to now, we have analyzed a regularly moving NiH, for example one moving with the rule $\rho$. What happens if the NiH is allowed to move to a randomly picked nearest neighbor, as shown in Fig. 12?
Two typical runs of a genGA with this fitness landscape are depicted in Fig. 13. The setting was chosen the same as in Fig. 6, which allows for a direct comparison of the GA's behavior for regularly and stochastically moving NiHs. The overall behavior is similar: for large mutation rates, the population loses the needle string, whereas the moving needle is tracked stably for mutation rates close to the above-defined optimal mutation rate. In addition, strong fluctuations in the initial concentrations of the future needle string (lower ends of solid lines) as well as in the end-of-cycle concentrations (bullets) occur in the stochastic case. These result from backjumps: if, at the end of the current cycle, the needle jumps back to the string it occupied in the previous cycle, then the initial concentration of the future needle string is significantly larger than zero. This can be seen at several generations in Fig. 13 (right) and in Fig. 13 (left); the gaps in Fig. 13 (left) correspond to initial concentrations much larger than the plotted range. If no backjumps occur, as in long stretches of Fig. 13 (left), the system with the stochastic NiH behaves nearly indistinguishably from the one with the regularly moving NiH. Since backjumps always increase the concentration of the needle string in the very next occurring jumps, the above calculated fixed point is still a lower bound. Thus, our previous notion of the optimal mutation rate remains applicable to the stochastically moving NiH, although the assumption of Eq. 14 is not always fulfilled.
Nilsson and Snoad [19] carried out their analysis of the continuous Eigen model Eq. 3 with a stochastic NiH in a similar way as we did above. In analogy to their calculation for the continuous Eigen model, we would find for a genGA the optimal mutation rate $\mu_{\mathrm{opt}} = 1/(\ell T)$, which is inconsistent with the quasistatic limit (see Section 4.3). The reason is the missing normalization in the work of Nilsson and Snoad; for that same reason, they could not derive an expression for the fixed point concentration.
4.5 Jumps of Larger Distance
To conclude this section on the behavior of genGAs with different kinds of NiHs that move to nearest neighbors, let us briefly discuss jumps of Hamming distance larger than one. Obviously, the analytical calculations become more complicated, because the approximation is no longer sufficient, as it connects only nearest neighbors. To describe jumps of larger distance, the concentrations of some intermediate sequences need to be taken into account, so that we have to solve a time evolution much more complicated than Eq. 15. Hence, we cannot make simple statements for finite . On the other hand, for large the system approaches the quasistatic region, which is characterized by and , as we have seen in Fig. 11.
The exact quasispecies for is shown in Fig. 14. The plotted values are error class concentrations, in order to make the higher error classes visible at all. Each mutant has a concentration of in the quasispecies state: for a NiH, a mutant’s fitness depends only on its Hamming distance to the needle, and therefore all mutants within the same error class have the same concentration in the quasispecies. For finite populations, this holds only on average, because the asymptotic state is distorted by fluctuations. In the following, however, we assume that the quasispecies is still representative of the average distribution of the population in the asymptotic state. Then, the optimal mutation rate in the sense of Section 4.3 for jumps of distance is by definition the position of the maximum of . For , optimal mutation rate and error threshold become identical. Although is maximized for mutation rates close to the error threshold, it amounts, as do all other concentrations, to only , which leads to approximately random drift in finite populations. On the other hand, the chance of tracking the needle decreases even further for small mutation rates, because then the concentration becomes even smaller. In this sense, the quasispecies distribution, which is centered on the needle string, is useless for tracking the next jump if . This also suggests, in agreement with the experimental findings of Rowe [13] (in this book), that finite populations are unable to track large jumps at low mutation rates, in particular in the extreme case . Only for jumps of does the corresponding error class concentration show a maximum significantly above . From the heights of the concentration maxima, we see that the difficulty of tracking the changes increases with the Hamming distance of the jumps. Vice versa, the advantage a population gains after a jump from its structure prior to the jump decreases with increasing jump distance.
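Because a NiH fitness depends only on the Hamming distance to the needle, the exact quasispecies can be computed by lumping all strings into error classes and taking the principal eigenvector of the lumped selection-mutation matrix. The sketch below, with illustrative parameter values rather than those of Fig. 14, computes these error class concentrations and then scans the mutation rate for the maximum of a given class concentration, i.e. the optimal mutation rate for jumps of that distance in the sense of Section 4.3.

```python
import numpy as np
from math import comb

L, SIGMA = 20, 10.0   # illustrative: string length, needle fitness

def quasispecies(mu):
    """Error-class concentrations of the exact quasispecies for a NiH:
    principal eigenvector of the lumped selection-mutation matrix."""
    M = np.zeros((L + 1, L + 1))        # M[i, j]: class j -> class i
    for j in range(L + 1):
        for a in range(j + 1):          # wrong bits flipped back
            for b in range(L - j + 1):  # correct bits flipped away
                M[j - a + b, j] += (comb(j, a) * mu**a * (1 - mu)**(j - a)
                                    * comb(L - j, b) * mu**b * (1 - mu)**(L - j - b))
    f = np.ones(L + 1)
    f[0] = SIGMA                        # only the needle class is fitter
    vals, vecs = np.linalg.eig(M * f)   # selection, then mutation
    qs = np.real(vecs[:, np.argmax(np.real(vals))])
    return qs / qs.sum()

# Optimal mutation rate for jumps of distance d: the mu maximizing class d.
d = 2
mus = np.linspace(0.005, 0.3, 60)
pk = [quasispecies(mu)[d] for mu in mus]
best = mus[int(np.argmax(pk))]
print(f"class-{d} concentration is maximal near mu = {best:.3f}")
```

Plotting `pk` over `mus` for several distances `d` reproduces the qualitative picture described above: the maxima shrink rapidly with increasing `d`, and no single mutation rate maximizes more than one class at once.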
In addition, a mutation rate that is simultaneously optimal for more than one jump distance cannot be found.
5 Conclusions and Future Work
On the basis of general arguments, the phase diagrams of population-based mutation and probabilistic selection systems, such as the above genGA, ssGA, and Eigen model in a time-dependent fitness landscape, can be easily understood. The notion of regular changes allows for an exact calculation of the asymptotic state in the sense of a generalized, time-dependent quasispecies. For a genGA with a NiH that moves regularly to nearest neighbors, the quasispecies can be calculated straightforwardly under simplifying assumptions. The result is a lower bound for the exact quasispecies. With that lower bound, we have constructed the phase diagram in the infinite population limit. This phase diagram agrees with the one derived from general arguments.
In order to improve our analysis, we need to weaken our assumptions. In particular, we have to overcome the restriction of taking into account only the flow from the current towards the future needle string; the other contributions to the flow have to be modeled in some way. Another future step could be an investigation of the fluctuations around the quasispecies that are introduced by the finiteness of realistic populations (discreteness of ). This would lead to a lower bound on the population size above which the needle string is not lost due to those fluctuations.
An extension of our analysis to non-regularities, such as the occurrence of more than a single jump rule, can be achieved by averaging the time evolution Eq. 8 according to each rule’s probability of being applied. A similar averaging procedure will be necessary if the cycle length fluctuates. Finally, extensions of the description to broader, more realistic peaks, as well as to GA models including crossover and other selection schemes, are important topics for future work.
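The averaging idea can be illustrated schematically. In the toy sketch below, Eq. 8 itself is not reproduced, and the state, the two cyclic-shift "rules", and all names are purely illustrative: each rule induces a linear one-cycle operator, and the averaged evolution mixes these operators with the rules' probabilities.

```python
import numpy as np

def averaged_operator(ops, probs):
    """Expected one-cycle operator when each rule r (with operator G_r)
    is applied with probability pi_r: G_avg = sum_r pi_r * G_r."""
    return sum(pi * G for pi, G in zip(probs, ops))

# Toy state: a probability distribution over n needle positions on a ring;
# the two "rules" move the needle one step forward or backward.
n = 7
shift_fwd = np.roll(np.eye(n), 1, axis=0)    # rule 1: cyclic shift +1
shift_back = np.roll(np.eye(n), -1, axis=0)  # rule 2: cyclic shift -1
G_avg = averaged_operator([shift_fwd, shift_back], [0.5, 0.5])

p = np.zeros(n)
p[0] = 1.0                # needle starts at position 0 with certainty
for _ in range(100):
    p = G_avg @ p         # averaged time evolution
print(np.round(p, 3))     # approaches the uniform distribution 1/7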
References
 [1] T. Bäck, U. Hammel and H.-P. Schwefel. Evolutionary Computation: Comments on the History and Current State. IEEE Transactions on Evolutionary Computation 1(1), p. 3, 1997.
 [2] T. Bäck, D. B. Fogel and Z. Michalewicz, editors. Handbook of Evolutionary Computation. IOP Publishing, Bristol, 1997.
 [3] J. Branke. Evolutionary Algorithms for Dynamic Optimization Problems, A Survey. Technical Report 387, AIFB, University of Karlsruhe, 1999.
 [4] J. E. Rowe. Finding attractors for periodic fitness functions. In W. Banzhaf et al., editors, Proceedings of GECCO 1999, Morgan Kaufmann, San Mateo, p. 557, 1999.
 [5] L. M. Schmitt, C. L. Nehaniv and R. H. Fujii. Linear analysis of genetic algorithms. Theoretical Computer Science 200, p. 101, 1998.
 [6] M. Eigen. Selforganization of matter and the evolution of biological macromolecules. Naturwissenschaften 58, p. 465, 1971.
 [7] M. Eigen and P. Schuster. The Hypercycle – A Principle of Natural Self-Organization. Springer-Verlag, Berlin, 1979.
 [8] M. Eigen, J. McCaskill and P. Schuster. The molecular quasispecies. Adv. Chem. Phys. 75, p. 149, 1989.
 [9] E. Baake and W. Gabriel. Biological evolution through mutation, selection, and drift: An introductory review. Ann. Rev. Comp. Phys. 7, in press, 1999.
 [10] J. E. Rowe. The dynamical systems model of the simple Genetic Algorithm. this issue, p. XXX, 1999.
 [11] M. D. Vose. The simple Genetic Algorithm – Foundations and Theory. MIT Press, Cambridge, 1999.
 [12] E. van Nimwegen, J. P. Crutchfield and M. Mitchell. Statistical Dynamics of the Royal Road genetic algorithms. Theoretical Computer Science, special issue on Evolutionary Computation, A. Eiben, G. Rudolph, editors, in press, 1998.
 [13] J. E. Rowe. Cyclic Attractors and Quasispecies Adaptability. this issue, p. XXX, 1999.
 [14] K. DeJong and J. Sarma. Generation Gaps Revisited. In L. D. Whitley, editor, Foundations of Genetic Algorithms 2, Morgan Kaufmann, San Mateo, p. 19, 1993.
 [15] A. Rogers and A. Prügel-Bennett. Modeling the Dynamics of a Steady State Genetic Algorithm. In W. Banzhaf and C. Reeves, editors, Foundations of Genetic Algorithms 5, Morgan Kaufmann, San Mateo, p. 57, 1998.
 [16] J. Branke, M. Cutaia and H. Dold. Reducing Genetic Drift in Steady State Evolutionary Algorithms. In W. Banzhaf et al., editors, Proceedings of GECCO 1999, Morgan Kaufmann, San Mateo, p. 68, 1999.
 [17] C. O. Wilke, C. Ronnewinkel and T. Martinetz. Molecular Evolution in Time-Dependent Environments. In D. Floreano, J.-D. Nicoud and F. Mondada, editors, Proceedings of the European Conference on Artificial Life 1999, Springer, Berlin, p. 417, 1999.
 [18] C. O. Wilke and C. Ronnewinkel. Dynamic Fitness Landscapes in the Quasispecies Model. In preparation.
 [19] M. Nilsson and N. Snoad. Error Thresholds on Dynamic Fitness Landscapes. Working Paper 99-04-030, Santa Fe Institute, 1999.
 [20] A more detailed explanation and analysis of the approximation used will be presented elsewhere.