The Fundamental Theorem of Natural Selection

07/12/2021 ∙ by John C. Baez

Suppose we have n different types of self-replicating entity, with the population P_i of the ith type changing at a rate equal to P_i times the fitness f_i of that type. Suppose the fitness f_i is any continuous function of all the populations P_1, …, P_n. Let p_i be the fraction of replicators that are of the ith type. Then p = (p_1, …, p_n) is a time-dependent probability distribution, and we prove that the square of its speed as measured by the Fisher information metric equals the variance in fitness. In rough terms, this says that the speed at which information is updated through natural selection equals the variance in fitness. This result can be seen as a modified version of Fisher's fundamental theorem of natural selection. We compare it to Fisher's original result as interpreted by Price, Ewens and Edwards.

1. Introduction

In 1930, Fisher [10] stated his “fundamental theorem of natural selection” as follows:

The rate of increase in fitness of any organism at any time is equal to its genetic variance in fitness at that time.

Some tried to make this statement precise as follows:

The time derivative of the mean fitness of a population equals the variance of its fitness.

But this is only true under very restrictive conditions, so a controversy was ignited.

An interesting resolution was proposed by Price [14], and later amplified by Ewens [8] and Edwards [7]. We can formalize their idea as follows. Suppose we have n types of self-replicating entity, and idealize the population of the ith type as a real-valued function P_i(t). Suppose

\[ \frac{dP_i}{dt} = f_i(P_1, \ldots, P_n) \, P_i \]

where the fitness f_i is a differentiable function of the populations of every type of replicator. The mean fitness at time t is

\[ \overline{f}(t) = \sum_{i=1}^n p_i(t) \, f_i(P_1(t), \ldots, P_n(t)) \]

where p_i(t) is the fraction of replicators of the ith type:

\[ p_i(t) = \frac{P_i(t)}{\sum_{j=1}^n P_j(t)}. \]

By the product rule, the rate of change of the mean fitness is the sum of two terms:

\[ \frac{d}{dt} \overline{f}(t) = \sum_{i=1}^n \dot{p}_i(t) \, f_i(P_1(t), \ldots, P_n(t)) \; + \; \sum_{i=1}^n p_i(t) \, \frac{d}{dt} f_i(P_1(t), \ldots, P_n(t)). \]

The first of these two terms equals the variance of the fitness at time t. We give the easy proof in Theorem 1. Unfortunately, the conceptual significance of this first term is much less clear than that of the total rate of change of mean fitness. Ewens concluded that “the theorem does not provide the substantial biological statement that Fisher claimed”.

But there is another way out, based on an idea Fisher himself introduced in 1922: Fisher information [9]. Fisher information gives rise to a Riemannian metric on the space of probability distributions on a finite set, called the ‘Fisher information metric’—or in the context of evolutionary game theory, the ‘Shahshahani metric’ [1, 2, 15]. Using this metric we can define the speed at which a time-dependent probability distribution changes with time. We call this its ‘Fisher speed’. Under just the assumptions already stated, we prove in Theorem 2 that the square of the Fisher speed of the probability distribution p(t) = (p_1(t), …, p_n(t)) is the variance of the fitness at time t.

As explained by Harper [11, 12], natural selection can be thought of as a learning process, and studied using ideas from information geometry [3]—that is, the geometry of the space of probability distributions. As p(t) changes with time, the rate at which information is updated is closely connected to its Fisher speed. Thus, our revised version of the fundamental theorem of natural selection can be loosely stated as follows:

As a population changes with time, the rate at which information is updated equals the variance of fitness.

The precise statement, with all the hypotheses, is in Theorem 2. But one lesson is this: variance in fitness may not cause ‘progress’ in the sense of increased mean fitness, but it does cause change.

2. The time derivative of mean fitness

Suppose we have n different types of entity, which we call replicators. Let P_i(t), or P_i for short, be the population of the ith type of replicator at time t, which we idealize as taking real values. Then a very general form of the Lotka–Volterra equations says that

\[ \frac{dP_i}{dt} = f_i(P_1, \ldots, P_n) \, P_i \qquad (1) \]

where f_i is the fitness function of the ith type of replicator. One might also consider fitness functions with explicit time dependence, but we do not do so here.

Let p_i(t), or p_i for short, be the probability at time t that a randomly chosen replicator will be of the ith type. More precisely, this is the fraction of replicators of the ith type:

\[ p_i = \frac{P_i}{\sum_{j=1}^n P_j} \qquad (2) \]

Using these probabilities we can define the mean fitness \overline{f} by

\[ \overline{f} = \sum_{i=1}^n p_i \, f_i(P_1, \ldots, P_n) \qquad (3) \]

and the variance in fitness by

\[ \operatorname{Var}(f) = \sum_{i=1}^n p_i \left( f_i(P_1, \ldots, P_n) - \overline{f} \right)^2. \qquad (4) \]

These quantities are also functions of t, but we suppress the t dependence in our notation.
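
To make equations (2)–(4) concrete, here is a minimal Python sketch of these definitions. The particular fitness function and population values are hypothetical choices, used only for illustration.

```python
import numpy as np

def probabilities(P):
    """Equation (2): p_i = P_i / sum_j P_j."""
    return P / P.sum()

def mean_fitness(P, fitness):
    """Equation (3): mean of the f_i weighted by the p_i."""
    return np.dot(probabilities(P), fitness(P))

def fitness_variance(P, fitness):
    """Equation (4): variance of the f_i weighted by the p_i."""
    p, f = probabilities(P), fitness(P)
    return np.dot(p, (f - np.dot(p, f)) ** 2)

# Hypothetical fitness functions: each f_i depends on all the populations.
def example_fitness(P):
    A = np.array([[0.0, -0.5, 0.2],
                  [0.3,  0.0, -0.4],
                  [-0.1, 0.6,  0.0]])
    return 1.0 + A @ P

P = np.array([1.0, 2.0, 0.5])   # illustrative populations of 3 replicator types
print(mean_fitness(P, example_fitness), fitness_variance(P, example_fitness))
```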

Fisher said that the variance in fitness equals the rate of change of mean fitness. Price [14], Ewens [8] and Edwards [7] argued that Fisher only meant to equate part of the rate of change in mean fitness to the variance in fitness. We can see this in the present context as follows. The time derivative of the mean fitness is the sum of two terms:

\[ \frac{d\overline{f}}{dt} = \sum_{i=1}^n \dot{p}_i \, f_i(P_1, \ldots, P_n) \; + \; \sum_{i=1}^n p_i \, \frac{d}{dt} f_i(P_1, \ldots, P_n) \]

and as we now show, the first term equals the variance in fitness. The second term only vanishes in special cases, e.g. when the fitness functions are constant.

Theorem 1.

Suppose the functions P_i obey the Lotka–Volterra equations (1) for some continuous functions f_i. Then

\[ \sum_{i=1}^n \dot{p}_i \, f_i(P_1, \ldots, P_n) = \operatorname{Var}(f). \]

Proof.

First we recall a standard formula for the time derivative \dot{p}_i. Using the definition of p_i in equation (2), the quotient rule gives

\[ \dot{p}_i = \frac{\dot{P}_i}{\sum_j P_j} - \frac{P_i \sum_j \dot{P}_j}{\left( \sum_j P_j \right)^2} \]

where all sums are from 1 to n. Using the Lotka–Volterra equations this becomes

\[ \dot{p}_i = \frac{f_i P_i}{\sum_j P_j} - \frac{P_i \sum_j f_j P_j}{\left( \sum_j P_j \right)^2} \]

where we write f_i to mean f_i(P_1, \ldots, P_n), and similarly for f_j. Using the definition of p_i again, this simplifies to:

\[ \dot{p}_i = f_i p_i - \Big( \sum_j f_j p_j \Big) p_i \]

and thanks to the definition of mean fitness in equation (3), this reduces to the well-known replicator equation:

\[ \dot{p}_i = \left( f_i - \overline{f} \right) p_i. \qquad (5) \]

Now, the replicator equation implies

\[ \sum_i \dot{p}_i f_i = \sum_i \left( f_i - \overline{f} \right) p_i \, f_i. \qquad (6) \]

On the other hand,

\[ \sum_i \left( f_i - \overline{f} \right) p_i \, \overline{f} = 0 \qquad (7) \]

since \sum_i p_i f_i = \overline{f} but also \sum_i p_i \overline{f} = \overline{f}. Subtracting equation (7) from equation (6) we obtain

\[ \sum_i \dot{p}_i f_i = \sum_i \left( f_i - \overline{f} \right) p_i \left( f_i - \overline{f} \right) \]

or simply

\[ \sum_i \dot{p}_i f_i = \operatorname{Var}(f). \qquad ∎ \]
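
As a quick sanity check on Theorem 1, the sketch below integrates the Lotka–Volterra equations for one small Euler step and compares a finite-difference estimate of \sum_i \dot{p}_i f_i with the variance in fitness. The fitness functions, populations and step size are hypothetical choices; the two printed numbers should agree up to discretization error.

```python
import numpy as np

def fitness(P):
    # Hypothetical fitness functions f_i(P_1, ..., P_n), linear in the populations.
    A = np.array([[0.0, -0.5, 0.2],
                  [0.3,  0.0, -0.4],
                  [-0.1, 0.6,  0.0]])
    return 1.0 + A @ P

dt = 1e-6
P = np.array([1.0, 2.0, 0.5])

# One Euler step of the Lotka-Volterra equations (1): dP_i/dt = f_i P_i.
P_next = P + dt * fitness(P) * P

p = P / P.sum()
p_next = P_next / P_next.sum()
p_dot = (p_next - p) / dt                      # finite-difference estimate of dp_i/dt

f = fitness(P)
first_term = np.dot(p_dot, f)                  # sum_i (dp_i/dt) f_i
variance = np.dot(p, (f - np.dot(p, f)) ** 2)  # Var(f), equation (4)
print(first_term, variance)                    # agree up to O(dt) error
```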

3. The Fisher speed

While Theorem 1 allows us to express the variance in fitness in terms of the time derivatives of the probabilities p_i, it does so in a way that also explicitly involves the fitness functions f_i. We now prove a simpler formula for the variance in fitness, which equates it with the square of the ‘Fisher speed’ of the probability distribution p(t) = (p_1(t), …, p_n(t)).

The space of probability distributions on the set {1, …, n} is the (n−1)-simplex

\[ \Delta^{n-1} = \left\{ (x_1, \ldots, x_n) \,:\, x_i \ge 0, \; \sum_{i=1}^n x_i = 1 \right\}. \]

The Fisher metric is the Riemannian metric g on the interior of the (n−1)-simplex such that given a point x in the interior of \Delta^{n-1} and two tangent vectors v, w we have

\[ g(v, w) = \sum_{i=1}^n \frac{v_i w_i}{x_i}. \]

Here we are describing the tangent vectors as vectors in ℝ^n with the property that the sum of their components is zero: this makes them tangent to the (n−1)-simplex. We are demanding that x be in the interior of the simplex to avoid dividing by zero, since on the boundary of the simplex we have x_i = 0 for at least one choice of i.

If we have a time-dependent probability distribution p(t) moving in the interior of the (n−1)-simplex as a function of time, its Fisher speed is defined by

\[ \sqrt{g(\dot{p}(t), \dot{p}(t))} = \left( \sum_{i=1}^n \frac{\dot{p}_i(t)^2}{p_i(t)} \right)^{1/2} \]

if the derivative \dot{p}(t) exists. This is the usual formula for the speed of a curve moving in a Riemannian manifold, specialized to the case at hand.
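
Here is a small Python sketch of this definition, estimating the Fisher speed of a curve in the interior of the simplex with a central finite difference; the curve itself is a hypothetical example.

```python
import numpy as np

def fisher_speed(p, p_dot):
    """Fisher speed: sqrt(sum_i p_dot_i^2 / p_i), the Fisher-metric norm of p_dot."""
    return np.sqrt(np.sum(p_dot ** 2 / p))

def curve(t):
    # Hypothetical smooth curve in the interior of the 2-simplex.
    x = np.array([1.0, np.exp(0.3 * t), np.exp(-0.2 * t)])
    return x / x.sum()

t, h = 0.5, 1e-6
p_dot = (curve(t + h) - curve(t - h)) / (2 * h)   # central-difference estimate of dp/dt
print(fisher_speed(curve(t), p_dot))
```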

These are all the formulas needed to prove our result. But for readers unfamiliar with the Fisher metric, a few words may provide some intuition. The factor of 1/x_i in the Fisher metric changes the geometry of the simplex so that it becomes round, with the geometry of a portion of a sphere in ℝ^n. But more relevant here is the Fisher metric’s connection to relative information—a generalization of Shannon information that depends on two probability distributions rather than just one [6]. Given probability distributions p, q ∈ \Delta^{n-1}, the information of q relative to p is

\[ I(q, p) = \sum_{i=1}^n q_i \ln\!\left( \frac{q_i}{p_i} \right). \]

This is the amount of information that has been updated if one replaces the prior distribution p with the posterior q. So, sometimes relative information is called the ‘information gain’. It is also called ‘relative entropy’ or ‘Kullback–Leibler divergence’. It has many applications to biology [5, 11, 12, 13].

Suppose p(t) is a smooth curve in the interior of the (n−1)-simplex. We can ask the rate at which information is being updated as time passes. Perhaps surprisingly, an easy calculation gives

\[ \left. \frac{d}{dt} I(p(t), p(t_0)) \right|_{t = t_0} = 0. \]

Thus, to first order, information is not being updated at all at any time t_0. However, another well-known calculation (see e.g. [4]) shows that

\[ \left. \frac{d^2}{dt^2} I(p(t), p(t_0)) \right|_{t = t_0} = g(\dot{p}(t_0), \dot{p}(t_0)). \]

So, to second order in t − t_0, the square of the Fisher speed determines how much information is updated when we pass from p(t_0) to p(t).
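
The following sketch checks this second-order relation numerically: for a small time increment Δt, the relative information I(p(t_0 + Δt), p(t_0)) should be close to one half of the squared Fisher speed times Δt². The curve is again a hypothetical example.

```python
import numpy as np

def curve(t):
    # Hypothetical smooth curve in the interior of the 2-simplex.
    x = np.array([1.0, np.exp(0.3 * t), np.exp(-0.2 * t)])
    return x / x.sum()

def relative_information(q, p):
    """I(q, p) = sum_i q_i ln(q_i / p_i)."""
    return np.sum(q * np.log(q / p))

t0, h = 0.0, 1e-6
p0 = curve(t0)
p_dot = (curve(t0 + h) - curve(t0 - h)) / (2 * h)   # dp/dt at t0
fisher_speed_sq = np.sum(p_dot ** 2 / p0)           # g(p_dot, p_dot)

delta_t = 1e-3
info_gain = relative_information(curve(t0 + delta_t), p0)
second_order = 0.5 * fisher_speed_sq * delta_t ** 2
print(info_gain, second_order)   # agree to leading order in delta_t
```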

Theorem 2.

Suppose the functions P_i obey the Lotka–Volterra equations (1) for some continuous functions f_i. If none of the P_i are zero, then the square of the Fisher speed of the probability distribution p(t) is the variance of the fitness:

\[ g(\dot{p}, \dot{p}) = \operatorname{Var}(f). \]

Proof.

Consider the square of the Fisher speed

\[ g(\dot{p}, \dot{p}) = \sum_{i=1}^n \frac{\dot{p}_i^2}{p_i} \]

and use the replicator equation (5),

\[ \dot{p}_i = \left( f_i(P_1, \ldots, P_n) - \overline{f} \right) p_i, \]

obtaining

\[ g(\dot{p}, \dot{p}) = \sum_{i=1}^n \left( f_i(P_1, \ldots, P_n) - \overline{f} \right)^2 p_i = \operatorname{Var}(f) \]

as desired. ∎
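
As with Theorem 1, we can spot-check this numerically: simulate one small step of the Lotka–Volterra dynamics, estimate the squared Fisher speed by finite differences, and compare it with the variance in fitness. The fitness functions and initial populations below are hypothetical.

```python
import numpy as np

def fitness(P):
    # Hypothetical fitness functions f_i(P_1, ..., P_n), linear in the populations.
    A = np.array([[0.0, -0.5, 0.2],
                  [0.3,  0.0, -0.4],
                  [-0.1, 0.6,  0.0]])
    return 1.0 + A @ P

dt = 1e-6
P = np.array([1.0, 2.0, 0.5])
P_next = P + dt * fitness(P) * P               # Euler step of dP_i/dt = f_i P_i

p = P / P.sum()
p_next = P_next / P_next.sum()
p_dot = (p_next - p) / dt                      # finite-difference dp_i/dt

fisher_speed_sq = np.sum(p_dot ** 2 / p)       # g(p_dot, p_dot)
f = fitness(P)
variance = np.dot(p, (f - np.dot(p, f)) ** 2)  # Var(f)
print(fisher_speed_sq, variance)               # agree up to O(dt) error
```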

The generality of this result is remarkable. Formally, any autonomous system of first-order differential equations

\[ \frac{dP_i}{dt} = F_i(P_1, \ldots, P_n) \]

can be rewritten as Lotka–Volterra equations

\[ \frac{dP_i}{dt} = f_i(P_1, \ldots, P_n) \, P_i \]

simply by setting

\[ f_i(P_1, \ldots, P_n) = \frac{F_i(P_1, \ldots, P_n)}{P_i}. \]

In general f_i is undefined when P_i = 0, but this is not a problem if we restrict ourselves to situations where all the populations P_i are positive; in these situations Theorems 1 and 2 apply.
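
For instance, the sketch below takes a hypothetical autonomous system dP_i/dt = F_i(P) and checks that defining f_i = F_i / P_i reproduces the same right-hand side in Lotka–Volterra form, provided all populations are positive.

```python
import numpy as np

def F(P):
    # Hypothetical autonomous system, not written in Lotka-Volterra form.
    return np.array([P[1] - P[0] ** 2, 1.0 - P[0] * P[1]])

def f(P):
    # Fitness functions of the equivalent Lotka-Volterra system: f_i = F_i / P_i.
    return F(P) / P

P = np.array([0.7, 1.3])        # positive populations, so the division is safe
print(F(P))                     # original right-hand side
print(f(P) * P)                 # Lotka-Volterra right-hand side: identical
```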

Acknowledgments

This work was supported by the Topos Institute. I thank Marc Harper for his invaluable continued help with this subject, and evolutionary game theory more generally.

References